Affectiva and Nuance Communications are collaborating “to further humanize” in-car artificial intelligence. Affectiva, a Boston, Massachusetts-based spin-off from the MIT Media Lab, has developed an ‘Emotion AI’ that measures facial expressions to assess emotions such as joy, anger and surprise in real time, and picks up on vocal expressions of anger, engagement and laughter; it also tracks indicators of drowsiness, such as yawning, eye closure and blinking rates, as well as physical or mental distraction.
The Affectiva Automotive AI will be integrated with Nuance’s Dragon Drive automotive assistant platform, a conversational AI powered by natural language understanding for interactions via gesture, touch, gaze detection and voice recognition. Dragon Drive already features in over 200 million cars on the road, supplied to Audi, BMW, Daimler, Fiat, Ford, GM, SAIC, Toyota and others across 40 languages.
In the near term, the integration will help the AI assistant to ‘learn’ and understand driver and passenger emotion and behaviour, and tailor its responses and recommendations accordingly. In the longer term, this could be used to address safety-related issues, such as taking control of an automated vehicle if the driver is judged to be distracted or drowsy.
Stefan Ortmanns, executive VP and general manager, Nuance Automotive, said in a statement: “As our OEM partners look to build the next generation of automotive assistants for the future of connected and autonomous cars, integration of additional modes of interaction will be essential not just for effectiveness and efficiency, but also safety. Leveraging Affectiva’s technology to recognise and analyse the driver’s emotional state will further humanise the automotive assistant experience, transforming the in-car HMI and forging a stronger connection between the driver and the OEM’s brand.”
Dr Rana el Kaliouby, CEO and co-founder of Affectiva, added: “We’re seeing a significant shift in the way that people today want to interact with technology, whether that’s a virtual assistant in their homes, or an assistant in their cars. OEMs and Tier 1 suppliers can now address that desire by deploying automotive assistants that are highly relatable, intelligent and able to emulate the way that people interact with one another. This presents a significant opportunity for them to differentiate their offerings from the competition in the short-term, and plan for consumer expectations that will continue to shift over time.”
BMW and Groupe PSA launch AI assistants
Both the BMW Group and Groupe PSA have this week announced the addition of AI assistants to their in-car technologies. BMW’s Intelligent Personal Assistant (IPA) [pictured above], available from March 2019, is voice-activated and awakened by a ‘Hey BMW’ command; it is compatible with other digital voice assistants, including Amazon Alexa, and can be used remotely via a smart speaker at home or a smartphone. Its software will be updated remotely over the air, with new functions and skills added via BMW’s Open Mobility Cloud.
The BMW IPA ‘learns’ routines and habits, driver preferences and frequent journey destinations: it will respond to a ‘take me home’ command, for example. It can provide status information such as oil level, and explain certain functions or, say, dashboard warning lights; BMW promises it can even get conversational, and drivers can give it a name. The IPA can also sync with diary and calendar data, arrange service appointments and integrate with Microsoft Office 365 and Skype for office functions, as well as finding desired entertainment such as music by genre.
Similarly, Groupe PSA has confirmed that a smart voice assistant will be available from 2020 in Peugeot, Citroen, DS, Opel and Vauxhall vehicles. This too will use natural language recognition; PSA cites applications such as finding a restaurant and other connected services, as well as regulating cabin temperature, ventilation and the like.