Apple used Tensor Processing Units (TPUs) developed by Google, rather than Nvidia's GPUs, to build two critical components of Apple Intelligence, as detailed in a new research paper by Apple. The paper, highlighted by CNBC, reveals that Apple utilized 2,048 TPUv5p chips and 8,192 TPUv4 processors to train its AI models. The absence of Nvidia hardware in the paper suggests a deliberate choice to favor Google's technology over Nvidia's widely used GPUs. This decision is significant given Nvidia's dominance in the AI processor market, where its GPUs are known for performance and efficiency.
Unlike Nvidia, which sells its hardware directly, Google offers its TPUs through cloud services, requiring customers to develop software within Google's ecosystem, which includes integrated tools for AI model development and deployment. Apple's engineers explained that Google's TPUs enabled efficient training of large, sophisticated AI models by leveraging the processing power of large TPU clusters. Apple plans to invest over $5 billion in AI server enhancements over the next two years, aiming to bolster its AI capabilities and reduce dependence on external hardware.
The paper also addresses ethical considerations in AI development, with Apple emphasizing responsible data practices and stating that no private user data was used for training. The training data comprised publicly available, licensed, and open-source datasets, curated to protect user privacy.
macrumors.com
