OpenAI introduced a set of new developer tools today at its DevDay product event in San Francisco. The additions are headlined by the Realtime API, a cloud service that enables software teams to equip ...
Featuring multimodal support and model distillation for training smaller AI models, the new Nova Premier signals a strategic shift by AWS in the enterprise market, analysts say. Amazon Web Services ...
Model distillation is one of the technology trends that Gartner's 2025 Hype Cycle for artificial intelligence (AI) identifies as having reached the maturity level it calls "the slope of enlightenment".
OpenAI, fresh from securing a funding boost that catapulted its valuation to $157 billion, has introduced new tools for developers, enhancing its AI capabilities with multimodal fine-tuning options ...

What is AI Distillation?

Distillation, also known as model or knowledge distillation, is a process in which knowledge is transferred from a large, complex AI 'teacher' model to a smaller, more efficient 'student' model. Doing ...
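The teacher–student transfer described above is commonly implemented by training the student to match the teacher's temperature-softened output distribution. A minimal sketch of that distillation loss, following the standard softened-softmax formulation (the function names and the example logits here are illustrative, not from any specific library):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; higher T yields a softer distribution
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the temperature-softened teacher and student
    # distributions, scaled by T^2 as in the standard formulation
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p) - np.log(q)))
    return temperature ** 2 * kl

# A student whose logits match the teacher's incurs zero loss;
# any mismatch produces a positive penalty to minimize during training.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
print(distillation_loss([2.0, 1.0, 0.1], [0.5, 0.5, 0.5]) > 0)  # → True
```

In practice this soft-label loss is usually combined with the ordinary cross-entropy loss on the true labels, letting the student learn both from ground truth and from the teacher's richer output distribution.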
There's a new wrinkle in the saga of Chinese company DeepSeek's recent announcement of a super-capable R1 model that combines high ...