Hon Hai Research Institute, under the umbrella of Hon Hai Technology Group (Foxconn), has announced the launch of FoxBrain, Taiwan’s first Traditional Chinese Large Language Model (LLM). Developed in just four weeks using an innovative, cost-efficient training method, FoxBrain is set to redefine Taiwan’s AI landscape, particularly in reasoning capabilities and language optimisation.
A Milestone in Taiwan’s AI Development
FoxBrain represents a significant advancement in AI technology, optimised specifically for Taiwanese users and Traditional Chinese language processing. The model was initially designed for internal applications within Foxconn, supporting data analysis, decision-making, document collaboration, mathematics, coding, and problem-solving. However, the company has confirmed plans to release FoxBrain as an open-source model, allowing broader industry collaboration and innovation.
The AI system has demonstrated exceptional performance in logical and mathematical reasoning tests, outperforming existing models of a similar scale, such as Taiwan-Llama-70B. Its development underscores Taiwan’s ability to compete with international AI researchers despite limited computational resources.
Advanced Training Strategies and Technical Superiority
FoxBrain is built on Meta’s Llama 3.1 architecture, incorporating 70 billion parameters. It was trained using 120 NVIDIA H100 GPUs connected via NVIDIA Quantum-2 InfiniBand networking, which significantly enhanced data transfer speeds during training. The entire process was completed in just four weeks, setting a new benchmark for efficiency in AI model development.
Dr. Yung-Hui Li, Director of the Artificial Intelligence Research Center at Hon Hai Research Institute, highlighted the efficiency of the training process:
“In recent months, the deepening of reasoning capabilities and the efficient use of GPUs have become mainstream AI developments. Our FoxBrain model adopted a very efficient training strategy, focusing on optimising the process rather than blindly accumulating computing power.”
Foxconn employed a proprietary data augmentation technique, generating 98 billion high-quality pre-training tokens across 24 topic categories. The model boasts a context window of 128,000 tokens, enabling it to process extensive text inputs with superior contextual awareness.
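To illustrate what a 128,000-token context window means in practice, the short sketch below checks whether a long document fits before it is sent to the model. FoxBrain has not yet been released, so the tokenizer of its Llama 3.1 base model is used as a stand-in; the model ID, file name, and output-token reserve are illustrative assumptions rather than details from the announcement.

```python
# Minimal sketch (assumptions noted above): estimate whether an input
# document fits within FoxBrain's stated 128,000-token context window.
from transformers import AutoTokenizer

CONTEXT_WINDOW = 128_000  # tokens, per Hon Hai's announcement

# Stand-in tokenizer: FoxBrain is built on Meta's Llama 3.1 architecture,
# so its base model's tokenizer is assumed here purely for token counting.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-70B")

def fits_in_context(text: str, reserved_for_output: int = 4_000) -> bool:
    """Return True if `text` plus a reserve for generated tokens fits the window."""
    n_tokens = len(tokenizer.encode(text))
    return n_tokens + reserved_for_output <= CONTEXT_WINDOW

# Hypothetical usage with a long internal document
with open("quarterly_report.txt", encoding="utf-8") as f:
    print(fits_in_context(f.read()))
```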
Key technical features of FoxBrain include:
- Proprietary data augmentation and quality assessment techniques for 24 topic categories
- Training on 120 NVIDIA H100 GPUs for a total of 2,688 GPU days (see the sketch after this list)
- Multi-node parallel training for optimal performance and stability
- Implementation of Adaptive Reasoning Reflection for enhanced autonomous reasoning
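As a quick sanity check on those figures, the sketch below relates the reported GPU-day total to wall-clock training time. It uses only the numbers from the announcement; the arithmetic itself, not any official Foxconn tooling, is all that is shown.

```python
# Back-of-the-envelope check using only figures from the announcement:
# 2,688 GPU-days spread across 120 H100 GPUs implies the wall-clock
# training time, which should be consistent with the stated ~four weeks.
GPU_COUNT = 120
GPU_DAYS = 2_688

wall_clock_days = GPU_DAYS / GPU_COUNT   # 22.4 days
wall_clock_weeks = wall_clock_days / 7   # ~3.2 weeks

print(f"{wall_clock_days:.1f} days ≈ {wall_clock_weeks:.1f} weeks of training")
# Output: 22.4 days ≈ 3.2 weeks, i.e. completed within the four-week window
```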
Performance and Future Applications
Benchmarking on the TMMLU+ dataset showed FoxBrain surpassing comparable models, including Taiwan Llama and Meta Llama 3.1, particularly in mathematical and logical reasoning. While it still falls slightly behind DeepSeek’s distillation model, its performance is approaching world-class AI standards.
Although originally developed for Foxconn’s internal applications, FoxBrain is expected to play a key role in smart manufacturing, supply chain management, and business decision-making. The company plans to collaborate with industry partners to further expand its applications, leveraging AI to drive innovation across multiple sectors.
FoxBrain to be Showcased at NVIDIA GTC 2025
Foxconn will present FoxBrain at the upcoming NVIDIA GTC 2025 conference on March 20 in a session titled “From Open Source to Frontier AI: Build, Customise, and Extend Foundation Models.” During the session, the company will provide insights into FoxBrain’s development, capabilities, and future applications.
Foxconn’s partnership with NVIDIA played a crucial role in the model’s success, with technical support and computational resources provided via the Taipei-1 Supercomputer. The collaboration underscores the growing importance of AI in Taiwan’s technology ecosystem.
As FoxBrain moves towards an open-source future, it signals a major step forward in Taiwan’s AI ambitions, providing a competitive edge in the rapidly evolving artificial intelligence landscape.
Source: https://www.honhai.com/en-us/press-center/press-releases/latest-news/1549