
Apple’s AI gambit: Will it redefine privacy in tech?

Apple's bold move into Gen AI with strict privacy measures sets a new standard, reshaping tech ethics and user trust. Discover how Apple Intelligence prioritizes on-device data privacy, impacting the global AI economy and future tech industry.

Voice&Data Bureau



Back in November 2022, an application called ChatGPT from a then little-known organisation named OpenAI took the world by storm. Now, more than 18 months since that sunrise moment for generative artificial intelligence (Gen AI), the technology is on the verge of bringing billions of dollars into the global AI economy. One company, however, had been missing from this story: the world’s third-most valuable enterprise, Apple. On June 10, that changed, potentially reshaping the global AI economy.

In a pre-recorded keynote, Apple Chief Executive Officer Tim Cook and Senior Vice-President of Software Engineering Craig Federighi unveiled Apple Intelligence. As part of this suite, Apple will offer all of the Gen AI features that the likes of Google and Samsung have already talked about at length since October last year. The novel bit was not the features that Apple’s presentation showcased so excitedly, but an approach that could put Apple at an advantage despite being late to the AI party: data privacy.

Adding context is crucial to a ‘good’ Gen AI experience, making chatbot responses exponentially better, personalised, and more effective.


Mind the data

Apple’s announcement encompassed everything Google’s Pixel 8 and Samsung’s Galaxy S24 series had already spoken about, plus a few extra toys to play around with. With the launch of its next-generation mobile platform, iOS 18, and on the latest ‘Pro’ iPhones, Apple Intelligence will enable users to transcribe and record phone calls in real time, auto-generate emails and notes, create images through a ‘playground’ app or within other Apple apps, make custom emoticons, and organise notifications in an order tailored to each user’s personal usage preferences.

Federighi, on stage at the Worldwide Developers Conference, reaffirmed that all of this is possible because of ‘personal context,’ which Apple’s AI algorithms will build over time. He emphasised that this personal context will be stored solely on-device, with no remote Apple server ever accessing each iPhone user’s unique metadata. This user-centric approach, Apple argues, keeps user data safe and reinforces users’ sense of value and ownership.


In hindsight, the move was crucial. Since BlackBerry’s exit from the handset market, Apple has been considered the stronger option for user privacy. Gen AI tools, however, have faced significant criticism for consuming large volumes of private user data without clear disclosure. Apple’s announcement aims to address this concern head-on.

Central to Apple’s strategy is its commitment to running AI features natively on its devices, seamlessly integrating them into the core Apple experience.

The key question, however, remains: did it address the issue well enough?


How does personalised Gen AI work?

Personal data is crucial to a ‘good’ Gen AI experience. While ‘good’ is subjective, consider this: chatbots on most platforms have been rudimentary. Customers are often put off by the canned, formulaic answers that chatbots are pre-trained to offer by their underlying algorithms. Adding personal context to these algorithms can make chatbot responses exponentially better, more personalised, and more effective.

In Apple parlance, personal context involves collecting on-device metadata on how users operate their smartphones. This can include wake-up time, the types of apps used at specific hours, prioritised notifications, Siri conversation context, and contact names recognised in tandem with the photo library.
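Apple has not published how this personal context is actually structured or stored. Purely as an illustration, the Swift sketch below shows the kinds of on-device signals described above gathered into a single record; every type name and field is hypothetical, not Apple’s API.

```swift
import Foundation

// Hypothetical sketch only: Apple has not published a schema for personal
// context. This merely illustrates the kinds of on-device signals the
// article describes, kept and updated locally.
struct PersonalContext {
    var typicalWakeUpTime: DateComponents      // e.g. 6:30 am, inferred from usage
    var appUsageByHour: [Int: [String]]        // hour of day -> frequently used app bundle IDs
    var prioritisedNotificationApps: [String]  // apps whose alerts the user acts on first
    var recentSiriTopics: [String]             // short-lived conversational context for Siri
    var knownContacts: [String: String]        // contact name -> identifier matched with photo library faces
}

// The privacy property the article highlights: this record is created,
// updated, and read entirely on the device, never uploaded to a server.
func updateOnDevice(_ context: inout PersonalContext, appOpened bundleID: String, atHour hour: Int) {
    context.appUsageByHour[hour, default: []].append(bundleID)
}
```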


All of these data points converge on a key concern: sharing them with a cloud library and linking them with third-party sources could be devastating for user privacy.


How Apple handles it


To allay data-sharing concerns, Apple asserts that no personal metadata is ever shared with remote cloud servers. The company also says these privacy claims will be opened up to independent third-party privacy and security audits. It further stated that all metadata would be used exclusively by locally run AI models and would not tap larger databases on cloud platforms.

Federighi confirmed that Apple utilises a three-billion-parameter ‘small’ language model for general queries across iPhones, iPads, and Macs. The model is tailored for the general-purpose queries and use cases Apple has explicitly designed for the iPhone. Third-party apps will also contribute, over time, to the application metadata collected by the algorithm, and the holistic usage signature generated for each Apple user is not shared outside the siloed devices in question.

Cybersecurity experts have questioned Apple’s claims, something that only independent audits will help resolve. Further, an extended version of Apple Intelligence lets users voluntarily tap OpenAI’s ChatGPT, based on the latest GPT-4o AI model. Apple claims its deal with the Sam Altman-led company will ensure no user data is shared with third-party entities. However, it is unclear whether independent privacy audits on Apple’s platform will allow experts to probe how OpenAI handles Apple users’ data once it is shared from any device.


An extended version of Apple Intelligence lets users voluntarily tap OpenAI’s ChatGPT, based on the latest GPT-4o AI model.

Further, Apple is yet to clarify what would happen if an iPhone user with a pre-existing ChatGPT account, registered through their Apple ID, also looks to integrate their smartphone’s Siri with ChatGPT. If strict data privacy silos are maintained, this could be a tricky and operationally complex affair, even for OpenAI.
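Apple has not disclosed how requests are actually routed between its on-device model and ChatGPT. The Swift sketch below is a minimal illustration of the flow the company has described so far: queries stay on the local model by default, and OpenAI is reached only after an explicit, per-request user consent. All names and the consent mechanism are hypothetical.

```swift
import Foundation

// Hypothetical sketch only, not Apple's implementation: it illustrates the
// routing the article describes, with the ~3B-parameter on-device model as
// the default and ChatGPT (GPT-4o) used only on explicit opt-in.
enum AssistantBackend {
    case onDeviceModel   // Apple's small language model, runs locally
    case chatGPT         // OpenAI's GPT-4o, reached only with user consent
}

struct AssistantRequest {
    let query: String
    let needsWorldKnowledge: Bool   // e.g. open-ended questions beyond on-device context
}

func route(_ request: AssistantRequest, userConsentsToChatGPT: () -> Bool) -> AssistantBackend {
    // Default path: answer locally, so personal context never leaves the device.
    guard request.needsWorldKnowledge else { return .onDeviceModel }

    // Escalation path: ask the user before each hand-off to OpenAI.
    return userConsentsToChatGPT() ? .chatGPT : .onDeviceModel
}

// Usage example: a broad query is escalated only if the user agrees.
let decision = route(AssistantRequest(query: "Plan a week-long trip to Japan",
                                      needsWorldKnowledge: true),
                     userConsentsToChatGPT: { true })
print(decision)   // prints "chatGPT" in this illustrative run
```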

The impact of Apple’s move

Industry observers anticipate that Apple’s privacy initiative will set a precedent for the entire sector. So far, both Google and Samsung have indicated that they collect unspecified amounts of anonymised metadata from devices, which falls short of reassuring users compared to Apple’s stringent privacy standards.

Apple’s global footprint is substantial: an estimated 1.3 billion iPhones and, counting iPads and Macs, more than 1.5 billion users. With this scale, Apple is well positioned to expand the adoption of Gen AI solutions beyond what standalone services have achieved. Central to this strategy is Apple’s commitment to running AI features natively on its devices, seamlessly integrating them into the core Apple experience.

Whether other platforms will follow a similar path in handling Gen AI data remains uncertain and will unfold over time.

By Vernika Awal

feedbackvnd@cybermedia.co.in
