
Apple Gets Into AI… Finally

Despite being arguably the tech buzzword of the year, AI hasn’t yet seen much contribution from tech giant Apple… that is, until now.

According to Wayne Ma at The Information, Apple created a team four years ago, long before the current generative AI hype cycle, to develop large language models. In other words, Apple may not be as much of an LLM laggard as people thought. This isn’t surprising, given that Apple has almost always stayed ahead of the curve, helping create the foundation for countless mobile applications like Uber and TikTok.

Apple’s LLM team today is just 16 people strong, with a budget in the millions of dollars per day for training Apple’s most advanced models. (To put that in perspective, OpenAI CEO Sam Altman has said it cost more than $100 million to train the company’s most advanced model, GPT-4.)

Apple may not have the same number of AI PhDs as its rivals, but it shouldn’t be underestimated. First, Apple takes the distribution advantage of incumbents to a whole new level, as it has more than 2 billion active devices globally. Wayne reports that Apple plans to incorporate language models into Siri to allow users to automate complex tasks using voice commands. Of course, Siri has been the butt of jokes for its incompetence for years, possibly complicating those plans.

Second, Apple has historically taken advantage of running AI on the edge, which helps allay consumers’ privacy concerns. For instance, its face detection models run fully on iPhones rather than on servers.
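
For a concrete sense of what that on-device path looks like, here is a minimal sketch using Python and coremltools to convert a small PyTorch model into a Core ML package that runs entirely on the phone. The toy network and file name are illustrative stand-ins, not Apple’s actual face detection model.

```python
# Minimal sketch: convert a tiny PyTorch model to Core ML for on-device
# inference. The network below is a toy stand-in for a face detector.
import torch
import coremltools as ct

model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(16, 2),  # e.g., face / no-face logits
)
model.eval()

# Core ML conversion works from a traced (or scripted) model plus input shapes.
example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)

mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="image", shape=example_input.shape)],
    convert_to="mlprogram",
)
# The saved package ships inside the app bundle; inference never leaves the phone.
mlmodel.save("FaceDetector.mlpackage")
```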

Today, multi-billion-parameter language models like Apple’s Ajax GPT, which has more than 200 billion parameters, can’t run on devices due to their high compute and memory needs, but that could soon change. Developers have already gotten smaller versions of existing models, like Google’s PaLM 2 and Meta Platforms’ Llama 2, working on phones and laptops, and researchers at universities and startups like OctoML are also tackling this problem.
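
To illustrate the kind of approach those developers are using, here is a minimal sketch with the llama-cpp-python bindings running a 4-bit-quantized Llama 2 variant on an ordinary laptop CPU. The model file path is hypothetical; quantizing to 4 bits shrinks a 7-billion-parameter model to roughly 4 GB, small enough to fit in laptop memory.

```python
# Minimal sketch: run a quantized 7B-parameter Llama 2 locally, no server needed.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-7b-chat.Q4_K_M.gguf",  # hypothetical local 4-bit file
    n_ctx=2048,   # context window
    n_threads=8,  # plain CPU threads; no GPU required
)

out = llm(
    "Q: Why can a 7B-parameter model run on a laptop while a 200B one cannot? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```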

The strong presence of former Google staffers in Apple’s AI teams has led to a preference for Google’s cloud services and specialized AI chips called tensor processing units. Wayne discovered, for instance, that Apple’s model-training software is optimized for TPUs.
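
To make “optimized for TPUs” concrete: TPU-first training code is typically written against Google’s XLA compiler, for example through the JAX framework. The sketch below is a generic illustration of that pattern in Python, not Apple’s actual training stack; the toy linear model stands in for a real transformer.

```python
# Minimal sketch of TPU-oriented training: jit-compiled steps go through XLA,
# the compiler TPUs are built around. Runs unchanged on CPU, GPU, or TPU.
import jax
import jax.numpy as jnp

print(jax.devices())  # on a Cloud TPU VM this lists TpuDevice entries

def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]  # toy linear model
    return jnp.mean((pred - y) ** 2)

@jax.jit  # XLA-compile the whole training step for the attached accelerator
def train_step(params, x, y, lr=1e-2):
    grads = jax.grad(loss_fn)(params, x, y)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = {"w": jax.random.normal(key, (8, 1)), "b": jnp.zeros((1,))}
x = jax.random.normal(key, (32, 8))
y = x @ jnp.ones((8, 1))

for _ in range(200):
    params = train_step(params, x, y)
print(float(loss_fn(params, x, y)))  # should approach zero
```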

But if Apple’s usage of language models (and therefore its need for TPUs) explodes, Google could get a much-needed boost as it takes on other cloud providers like Amazon Web Services and Microsoft.

Apple’s potential weakness in the AI game is its commitment to data privacy, which may put it at a disadvantage versus competitors like OpenAI that have shown a greater willingness to push boundaries in gathering public data to train AI models. The boundary-pushers, though, will have to stomach some copyright court battles over their practices.
