Apple has officially disclosed that it trained its AI models using Google’s Tensor Processing Units (TPUs), a significant departure from the industry standard of using Nvidia GPUs. In a research paper published on Monday, the company shared that two essential components of its AI system were pre-trained on Google-designed processors.
“Wow, Apple Foundational Models, both on device and server, were fully trained on Google TPU clusters. They didn’t use Nvidia in any training for it,” Max Weinbach (@MaxWinebach) posted on July 29, 2024 (pic.twitter.com/Kn4G9AVfJn).
According to the company, it used two different generations of Google’s in-house TPUs, arranged in sizable chip clusters: 2,048 TPUv5p chips to train the on-device AI model built for iPhones and other devices, and 8,192 TPUv4 processors for its server AI model.
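Training at that scale means spreading work across the many chips of a TPU pod slice rather than a single accelerator. The snippet below is a minimal, illustrative JAX sketch of sharding a computation across whatever TPU devices are visible to a program; it is not Apple’s code, and the mesh shape, batch size, and function names are assumptions chosen purely for illustration.

```python
# Illustrative sketch only (not Apple's training code): distribute a
# simple computation across the TPU chips visible to this process.
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

devices = jax.devices()  # every accelerator chip visible to the program
mesh = Mesh(mesh_utils.create_device_mesh((len(devices),)),
            axis_names=("data",))

# Shard a batch of activations across the "data" axis of the mesh.
batch = jnp.ones((len(devices) * 8, 1024))
sharded_batch = jax.device_put(batch, NamedSharding(mesh, P("data", None)))

@jax.jit
def forward(x):
    # Stand-in for a single layer; real pre-training would also shard
    # model parameters and optimizer state across the pod.
    w = jnp.ones((1024, 1024))
    return jnp.tanh(x @ w)

out = forward(sharded_batch)  # executes in parallel across the devices
print(out.shape, out.sharding)
```

On a single machine this runs on whatever devices JAX finds (even just a CPU); on a TPU pod slice, the same code parallelizes across every chip in the slice, which is the general pattern pod-scale training relies on.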
This strategic move highlights the growing competition in artificial intelligence technology and the shifting alliances among the biggest tech companies as they work to advance their AI capabilities.
In addition, the iPhone maker offered a preview of its much-anticipated Apple Intelligence system.
At its WWDC conference in June, Apple also announced a wide range of new AI tools, features, and capabilities, including the integration of OpenAI’s ChatGPT into its software. A handful of Apple Intelligence features will be made available to beta testers this week.
To catch up with all the national and international industry updates, follow our social handles @BIZBoost and stay tuned to BIZBoost news.