Scenario:
The year is 2030. In addition to highly automated private cars, the established carsharing companies offer highly automated mobility with highly automated "driving pleasure".
What does "driving pleasure" mean when highly automated vehicles get us from A to B? What risks and opportunities do future user scenarios in highly automated mobility offer? In which ways can the user control the vehicle? Which requirements, but also limits, arise in situations where the user has to take over control within 8 seconds? Where can controls move?
What can the cockpit of the future look like from a UX perspective?

Click play to see a quick animation of the AURIGA test drive.

AURIGA
(Latin: "the charioteer") is a cockpit system for highly automated vehicles (SAE Level 4).
Auriga controls the vehicle, communicates with the user and makes use of the user's peripheral perception. The street scene with all relevant road users is projected onto the area between dashboard and windshield. This area is perceived peripherally both when looking at the street and when looking at the dashboard. So you can work, watch movies or surf the net while driving, and still be able to take the wheel when needed. The vehicle is controlled by gestures derived from intuitive behavior: to turn, you simply point in the direction you want to go; to accelerate, you shift your hands forward; and to brake, you pull your fingers together. In this way, you can also override the autopilot.
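The gesture mapping described above could be sketched as a simple lookup from recognized gestures to vehicle commands. This is purely an illustrative sketch: the gesture names, command fields and values are assumptions for this example, not part of the actual AURIGA system.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Gesture(Enum):
    """Gestures as described in the concept (names are illustrative)."""
    POINT_LEFT = auto()        # point left -> turn left
    POINT_RIGHT = auto()       # point right -> turn right
    HANDS_FORWARD = auto()     # shift hands forward -> accelerate
    FINGERS_TOGETHER = auto()  # pull fingers together -> brake


@dataclass
class DriveCommand:
    steer: float     # -1.0 (full left) .. 1.0 (full right)
    throttle: float  # 0.0 .. 1.0
    brake: float     # 0.0 .. 1.0
    override: bool   # True: the user's gesture overrides the autopilot


def interpret(gesture: Gesture) -> DriveCommand:
    """Translate an intuitive gesture into a vehicle command."""
    if gesture is Gesture.POINT_LEFT:
        return DriveCommand(steer=-1.0, throttle=0.0, brake=0.0, override=True)
    if gesture is Gesture.POINT_RIGHT:
        return DriveCommand(steer=1.0, throttle=0.0, brake=0.0, override=True)
    if gesture is Gesture.HANDS_FORWARD:
        return DriveCommand(steer=0.0, throttle=0.5, brake=0.0, override=True)
    # FINGERS_TOGETHER
    return DriveCommand(steer=0.0, throttle=0.0, brake=1.0, override=True)
```

For example, `interpret(Gesture.POINT_LEFT)` yields a command steering fully left while overriding the autopilot.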

If you are interested in the whole design process, with more theory, mockups and a short survey, click the link below to view the documentation as a PDF.

Project partners:

Milan Bergheim | http://www.aromablind.myportfolio.com
Tillmann Kayser | http://www.kayserdesign.com
Thank you for your attention!