Industry Leaders Discuss UX Megatrends, Part 3: ADAS and Autonomy

Rightware Party at The Venetian, Las Vegas
Graphics technologies of today are evolving for the autonomous vehicles of tomorrow

The following is the last in a three-part series from a fireside chat held at the Rightware Party at CES 2019 in Las Vegas. For more, read Part 1 on the increasing size and number of displays and Part 2 on connected ecosystems.

Whenever business leaders gather to talk about the automotive industry, hot topics are almost certain to become part of the discussion. When Rightware gathered software, semiconductor, and automotive leaders for a fireside chat at its annual Kanzi Partner Program social event at the 2019 Consumer Electronics Show in Las Vegas, the conversation centered around three megatrends in the automotive industry: the increasing size of screens in cars; connectivity and app ecosystems inside the vehicle; and the evolution toward semi- and fully autonomous vehicles.

Sharing their visions were Dan Mender, vice president of business development at Green Hills Software; J F Grossen, global vice president of design for HERE Technologies; Michael Groene, director of electrical engineering at Karma Automotive; Bill Pinnell, senior director of automotive software at Qualcomm; and Ville Ilves, CEO of Rightware. The discussion was moderated by Derek Sellin, vice president of marketing for Rightware.

The road to autonomy

Today’s vehicles have more digital screens than ever. In Part 1 of this three-part series, panelists agreed the trend toward more screens and digital real estate in vehicles will continue. That adds a new level of complexity for software companies, UI designers, developers, and automakers, as discussed in Part 2 of the series.

How screens are used in vehicles of the future, and the graphical content displayed on them, are likely to change as levels of automated driving increase. Screens in today’s vehicles provide drivers with information critical to navigating safely from Point A to Point B, while also providing limited infotainment content. As the industry moves to Level 3 autonomy, safety-critical functions will shift completely to the vehicle under certain traffic or environmental conditions. The driver will still be present and ready to intervene if necessary but will not be expected to monitor the car or traffic as we do today.

Regardless of when self-driving cars arrive en masse, today’s vehicle displays, and the information projected on them, serve as a stepping stone in the evolution toward autonomy. “As autonomy comes in, we’re not going to be (in the vehicle) just watching a video,” said Pinnell, who said drivers, or “semi-drivers” will need to be aware of the vehicle, its surroundings, and where it’s headed. Displaying vehicle-to-everything (V2X) information in the vehicle will grow in importance as levels of autonomy increase. “Showing instances like pedestrians crossing the road in front of you, or road work and things coming up, gives some contextual awareness to your journey.”

Grossen said layering V2X data, as well as other information, such as points of interest, on top of maps and routing not only helps develop consumer trust in self-driving vehicles, but also creates a more enjoyable experience in the car. “Focusing some of the information, whether it be about buildings or landmarks based on your preferences, creates a customized, personalized experience with the maps,” he said.

Cybersecurity is also a key element in protecting graphical content and the systems that deliver it, especially in an automated vehicle, Mender noted. “When you think about the amount of (digital) real estate and the rich criticality of the type of information that’s being displayed, it’s important to focus on how you are going to make sure your system is safe and secure.”

The Future is Now

The future of vehicles, specifically as it pertains to digital displays and the technology behind them, is rapidly taking shape. The panelists agreed the future is coming fast; perhaps as soon as CES 2020, discussions will focus on machine learning and artificial intelligence that link occupants’ eye movements and emotions to the information displayed on the vehicle’s screens.

The panelists noted it’s about more than just the technology recognizing the driver’s state of anger or fatigue. They expect to see demonstrations soon of vehicles responding to the driver’s cues to protect the vehicle and its occupants. “Being able to track the emotion of the driver around stress situations and allow the vehicle to react to that,” said Grossen. “It’s not just ‘hey, you’re angry,’ but ‘you’re angry and we’re going to do something about it.’”

When will that happen? “I think we’ve got the technology enablers now, but I think it’s really going to come to fruition within a year,” said Pinnell.

While new technology will continue to flood the market, Groene believes it’s also time for automakers and suppliers to make time to listen to customer feedback and refine where they focus their efforts. “Right now, we are pushing, but we also will learn a lot more from our customers in the field: what is missing, what needs to be refined,” said Groene. “I think we will see a lot of lessons learned.”

Rightware thanks the panelists for participating in the fireside chat and all of its partners that create a global ecosystem of companies working together to develop and define the future of digital HMI utilizing Kanzi. Rightware also wants to thank the companies that sponsored its annual Kanzi Partner Program reception at CES: Monotype, KPIT, E-Planet, Siili Solutions, and Green Hills Software.

Read more about Rightware’s CES announcement of Kanzi for Android, bringing high-performance graphics to Android IVI systems and sharing data from Android-based apps and services across the vehicle cockpit. For a glimpse into Rightware's private suite, head over to the CES 2019 Kanzi Experience.

The Future of Automotive UX - Download presentation deck