Particle.news

Foxconn Launches Taiwan's First Large Language Model, 'FoxBrain'

The AI model, built using Nvidia GPUs and Meta's Llama 3.1 architecture, aims to enhance manufacturing and supply chain efficiency.

  • Foxconn unveiled 'FoxBrain,' a large language model optimized for traditional Chinese and Taiwanese language styles, billed as Taiwan's first such model.
  • The model was trained in four weeks using 120 Nvidia H100 GPUs and is based on Meta's Llama 3.1 architecture.
  • FoxBrain is designed for internal applications, including data analysis, decision-making, mathematics, and code generation, with plans to open-source it for broader industry collaboration.
  • Foxconn intends to use the AI model to drive advancements in manufacturing and supply chain management, reflecting its diversification into AI and electric vehicles.
  • The company will provide more details about FoxBrain at Nvidia's GTC developer conference in mid-March.