Apple may have delayed the Siri upgrade for fear of jailbreaks
Appleâs work on AI-enhancements for Siri has been officially delayed (itâs now slated to roll out âin the coming yearâ) and one developer thinks they know why â the smarter and more personalized Siri is, the more dangerous it can be if something goes wrong.
Simon Willison, the developer of the data analysis tool Dataset, points the finger at prompt injections. AIs are typically restricted by their parent companies who impose certain rules on them. However, itâs possible to âjailbreakâ the AI by talking it into breaking those rules. This is done with so-called âprompt injectionsâ.
As a simple example, an AI model may have been instructed to refuse to answer questions about doing something illegal. But what if you ask the AI to write you a poem about hotwiring a car? Writing poems isnât illegal, right?
This is an issue that all companies offering AI chatbots face and they have gotten better at blocking obvious jailbreaks, but itâs not a solved problem yet. Worse, jailbreaking Siri can have much worse consequences than most chatbots because of what it knows about you and what it can do. Apple spokeswoman Jacqueline Roy described Siri as follows:
âWeâve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps.â
Apple, undoubtedly, put rules in place to prevent Siri from accidentally revealing your private data. But what if a prompt injection can get it to do it anyway? The âability to take action for youâ can be exploited too, so itâs vital for a company that is as privacy and security conscious as Apple to make sure that Siri canât be jailbroken. And, apparently, this is going to take a while.