Timely Topic: AI Smartphones Could Offer Hackers a Treasure Trove of Data


06.27.2024


Smartphones keep getting smarter with artificial intelligence (AI) features that can transcribe voice memos, search email for concert tickets or translate a conversation in real time with someone who speaks another language.

But the innovative technology comes with new privacy risks.

“Although most people might feel concerned about the AI aspects of this new technology, I think the real concern is the information-access part,” said Dr. Kevin Hamlen, the Louis Beecherl Jr. Distinguished Professor of computer science and executive director of the Cyber Security Research and Education Institute in the Erik Jonsson School of Engineering and Computer Science at The University of Texas at Dallas.

Smartphones store information across many separate apps. To perform tasks such as finding concert tickets, AI needs access to as much of that data as possible.

“This means it will be a prime target for criminals wanting to steal private information,” said Hamlen. “Hackers will think, ‘Why break into 50 apps to steal a victim’s information when I can just break into one that has access to it all?’”

Criminals who break into smartphones can access personal information, including a user’s address, contacts, frequently visited places and photos. That data could then be used in ransomware attacks, sold on the dark web, or mined for clues that help attackers guess passwords to break into other accounts.

Generative AI is increasingly offered on mobile devices, including the Google Pixel 8 Pro and the Samsung Galaxy S24 Ultra. Apple recently announced that its AI system, Apple Intelligence, will be available on the iPhone 15 Pro and iPhone 15 Pro Max.

To protect user data, Apple uses end-to-end encryption, which Hamlen said is highly effective for protecting data “in flight” as it travels to its destination. Even that data, however, is vulnerable before encryption or after decryption, he said. Attacks called remote code execution, or RCE, exploit this vulnerability by hijacking phone apps to steal data before it is encrypted and sent to the cloud.
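To make the “in flight” distinction concrete, here is a minimal Python sketch. It is a hypothetical illustration, not Apple’s implementation: the memo text and variable names are invented, and it uses the open-source cryptography package in place of a real messaging stack.

```python
# Minimal sketch of the plaintext window around end-to-end encryption.
# Assumes the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, shared only by sender and recipient
cipher = Fernet(key)

# 1. The app holds the user's data as plaintext in memory.
memo = b"Reminder: concert tickets are in email"

# 2. An RCE attack that hijacks the app at this point reads the
#    plaintext directly; encryption has not happened yet.
stolen_by_rce = memo          # the attacker sees everything

# 3. Encryption protects the data only from here onward.
in_flight = cipher.encrypt(memo)   # ciphertext sent to the cloud

# 4. A network eavesdropper sees only unreadable ciphertext.
print(in_flight[:20], b"...")

# 5. The recipient decrypts, and the plaintext window reopens there too.
assert cipher.decrypt(in_flight) == memo
```

The point of the sketch is the ordering: an attacker who hijacks the app at step 2 never has to defeat the encryption at all.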

To protect information processed in the cloud, Apple developed Private Cloud Compute, which ensures the information will not be accessible to anyone other than the user. In addition, the company said it will not store data.

Dr. Kevin Hamlen is the Louis Beecherl Jr. Distinguished Professor of computer science and executive director of the Cyber Security Research and Education Institute at UT Dallas.

“Private Cloud Compute is a good step toward minimizing access to that centralized treasure trove of information, but it still means that any breaches will be more damaging,” Hamlen said. “Users should understand that they’re inevitably trading some of their security for ease of use with this new feature.”

For example, Google advises users that human reviewers read, annotate and process users’ conversations in the Gemini app, an AI assistant, to improve its products. Google suggests that users avoid entering confidential information that they would not want a reviewer to see.

In addition to attacks on a phone or the cloud, the AI model itself could become a target.

“An AI model that has learned your contacts, your scheduling habits, your email history, etc. is incredibly useful to adversaries,” he said. “If hackers can find a way to steal that model, they effectively gain access to the data from which it was trained.”

This type of personal information could fuel additional attacks.

“Such a model could be abused to generate highly convincing spear-phishing [hyper-personalized] attacks against the victim user or even the victim’s contacts,” Hamlen said. “Such risks are why Apple has been taking extensive steps to try to protect this new feature from malicious intrusions.”

Hamlen said he protects his own privacy by being selective about what he puts on his phone, including email accounts that contain sensitive data.

–Kim Horner

 
Note to journalists: Dr. Kevin Hamlen is available for news media interviews. Contact Kim Horner, 972-883-4463, kim.horner@utdallas.edu.
