It’s undeniable that with every passing day, we inch closer to the dawn of autonomous cars. Self-driving technologies of every shape and form are being developed and tested on private courses and, in some cases, even on public roads. Although there is no set date for what many believe will be a revolutionary point in human history, we do have a sense of what the technology will look like and how it will be implemented. The futuristic concept cars we’ve seen over the years have not evolved aesthetically beyond aerodynamic white plastic and glass shells. Instead, the innovation lies in the artificially intelligent inhabitants of tomorrow’s vehicles, which are no longer referred to as cars, but as “mobility solutions.”
In January, at the 2019 Detroit Auto Show, a large portion of Toyota’s booth was devoted to futuristic concept cars, with a spotlight on its Concept-i. The Concept-i is not new, having debuted at the 2017 Consumer Electronics Show, and it has toured the expo circuit since. While many visual cues of the Concept-i echo futuristic-looking concept cars of the past, such as the 1992 General Motors Ultralite from the Sylvester Stallone dystopian thriller “Demolition Man,” Toyota’s Concept-i attempts innovation on the software side of autonomy with the introduction of Yui, a UX-focused artificial intelligence designed to learn from and interface with the driver. Evolving personal assistant software such as Apple’s Siri or Amazon’s Alexa into the heart of the automobile will purportedly enable the car to monitor your involvement, to take over when you’re too tired, and to find amenities on a trip. While these innovations are not groundbreaking, they are part of an evolutionary path that will gradually wrest control and freedom away from human input.
Artificial intelligence seems to be at the core of our autonomous future, and Toyota’s Yui is not the only software-powered co-pilot in development. Michigan-based automotive supplier Denso recently demonstrated its technology at the Detroit Auto Show. In Denso’s VR demo, the AI takes control from the driver and shifts the car into autonomous mode; once there, the screen-based windshield display changes from a view of a bleak highway commute into something resembling a smartphone display, complete with apps and widgets. An unexpected meeting pops up on the calendar, and the car reroutes. Sensing the operator’s stress, the in-car AI selects Hawaii from a list of picturesque escapes, and suddenly the road vanishes, replaced with a sunset seascape.
As I sat through the VR demo, I was terrified to see the roadway vanish before my eyes. While allowing AI to move a vehicle is one evolutionary step I may eventually come around to, surrendering the ability to see where you’re going is an entirely different beast. Later in the demo, when you arrive at your destination, the car states, “Once your meeting is over, I’ll be here to pick you up.” There’s a lot of convenience here, especially in a car you don’t need to park, but I can’t escape the feeling that the autonomous car and its intelligent, in-car AI exist solely to shepherd us from one appointment to the next, shielding us from the banal as our lives pass us by like a progress bar or loading screen.
There’s an endless list of advantages to autonomous cars, most of them centered on removing human error from the roadway. Drunk driving incidents could give way to nothing worse than a drunken choice of destination, like the Grand Canyon, and distracted driving could be eliminated entirely. It’s evident that smartphone-enabled technologies will creep their way into car connectivity and automation, and while I wholeheartedly believe we spend too much time in a zombie-like state staring at our phones, I can only imagine what life will be like when our cars are our phones and we no longer have to drive.