Apple Partners with OpenAI and Google to Boost Siri’s AI Capabilities
On Monday, Apple CEO Tim Cook announced a significant deal with OpenAI to integrate OpenAI’s advanced artificial intelligence model into Apple’s voice assistant, Siri.
The Role of Google’s Technology
Amid the excitement, a technical document released by Apple highlighted another key player in this AI development: Alphabet’s Google. To create its foundational AI models, Apple used its own framework software along with a mix of hardware, including Apple’s own on-premise graphics processing units (GPUs) and Google’s cloud-based tensor processing units (TPUs).
Google has been developing TPUs for nearly a decade and has publicly shared details about the fifth-generation chips used for AI training; the performance variant of that generation is competitive with Nvidia’s H100 AI chips. Additionally, Google announced at its annual developer conference that a sixth-generation TPU would launch this year.
Advantages of Google’s TPUs
Designed specifically for running AI applications and training models, Google’s TPUs are central to its cloud computing hardware and software platform. To use the chips, clients typically purchase access to them through Google’s cloud division, much as customers buy computing time from Amazon’s AWS or Microsoft’s Azure.
Apple did not disclose the extent to which it relied on Google’s chips and software compared to hardware from Nvidia or other AI vendors. However, the inclusion of Google’s technology underscores the competitive landscape of AI hardware and the collaborative efforts between major tech companies.
Strategic Implications
The deals with OpenAI and Google highlight the tech industry’s interconnected nature in advancing AI technology: as Apple works to enhance Siri’s capabilities, it is drawing on cutting-edge AI hardware and software from rivals and partners alike.
Apple and Google did not immediately return requests for comment on the specifics of their collaboration.