SambaNova debuts Hume AI voice models for emotional speech AI
SambaNova has launched Hume AI's new emotionally intelligent voice models, EVI 4 Mini and Octave 2, exclusively on its SambaCloud platform, providing enterprises with access to real-time, multilingual, emotionally responsive speech AI.
The new models are designed to interpret emotional tone, intonation, and context in under 300 milliseconds, aiming to create more human-like AI interactions for enterprise deployments. This technology is available in 11 languages, including a new Japanese language model, and supports enterprise use cases such as customer service, instant personality cloning, and lifelike AI companions.
Exclusive rollout
The EVI 4 Mini (Empathic Voice Interface) and Octave 2 models are built to run end-to-end on SambaNova's full-stack AI infrastructure. They are accessible via the high-speed SambaCloud platform, which also supports large language models such as Llama and DeepSeek, and allows enterprise teams to deploy next-generation voice AI at up to four times lower total cost of ownership than competing solutions.
Alan Cowen, Chief Executive and Chief Scientist at Hume AI, commented on the collaboration and technical achievement:
"With SambaNova, we can deliver real-time, emotionally intelligent AI that actually listens, understands, and responds like a human, and does it at enterprise scale. SambaNova's platform gives us the speed, flexibility, and efficiency we need to bring expressive voice AI into the real world, including complex, real-time translation use cases like English-to-Japanese."
This initiative follows a longstanding partnership between the two companies and responds to increasing demand for conversational AI that can interpret and react to emotional cues as a human would, especially across global markets and languages.
Scalable and responsive AI
Rodrigo Liang, Chief Executive and Co-founder of SambaNova, said the new capabilities are already accessible to enterprise users via SambaCloud:
"Voice AI is fundamentally transforming communication for enterprises, enhancing how organizations connect with customers, employees, and users across every interaction. Today, Hume's EVI 4 and Octave 2 models are already exclusively available on SambaCloud, delivering users expressive, emotionally intelligent voice capabilities within a secure, high-performance infrastructure they trust. This announcement deepens our longstanding partnership with Hume, providing enterprise clients access to the fastest, most scalable, and emotionally aware AI experiences in the market."
The models are engineered to understand the nuances of what is being said and how it is said, enabling applications that require contextually accurate, emotionally expressive responses in real time. According to both companies, this step marks a notable development in bringing expressive and responsive AI capabilities into real-world enterprise settings.
Industry perspective
Discussing the importance of low-latency response in voice AI, Hayley Sutherland, Research Manager for Conversational AI at IDC, stated:
"Ultra-low latency is critical for speech models to provide an efficient and human-like interactive experience. Without it, voice-based conversational AI risks increasing customer frustration and handle times. This close partnership with Hume AI speaks to SambaNova's ability to provide key infrastructure with ultra-low latency for voice AI."
SambaNova's inference platform underpins the deployment of the models by providing 100-300 millisecond response times, full end-to-end hosting, native integration with large language models, private deployment for secure enterprise environments, and a lower cost structure compared to competing solutions.
The technology aims to eliminate robotic-sounding speech and reduce real-time lag, resulting in more natural-sounding AI voices for enterprise contexts. This may benefit sectors such as customer service, healthcare, gaming, and internal enterprise tools by allowing more engaging and realistic conversational experiences.
SambaNova and Hume AI continue to work together to enable AI solutions that understand and respond to human emotion, aligning with Hume AI's stated goal of developing AI that can better serve human objectives through the recognition of emotional cues.