At CES 2019, Nissan’s Invisible-to-Visible technology made its debut and painted the picture of an entirely new driving experience created through connected worlds.
Invisible-to-Visible, or I2V, is a groundbreaking technology that will merge the real and virtual worlds to help drivers see the invisible. From seeing what is around the next turn to having an avatar from the Metaverse (a type of virtual reality world) join you as a passenger for your ride, the technology aims to make driving more enjoyable and exciting.
I2V is made possible by Nissan’s Omni-Sensing technology, which aggregates real-time data from the environment inside and around the car via multiple inputs. First, driver and passenger sensors capture responses from inside the vehicle in real time (perhaps a driver is getting drowsy behind the wheel and needs help finding a coffee shop for a short break). Next, ProPILOT’s 360-degree sensors collect information around the vehicle, while remote sensing gathers data on road conditions, traffic and weather. The data collected from these inputs is processed and sent back to the vehicle in real time, giving the driver a full understanding of the surrounding conditions.
During an autonomous driving experience, I2V can make the ride more enjoyable by providing a guide from the Metaverse. This would make it possible for family, friends or other personalities to appear inside your car as you drive. If you’re in a new city, the system can find a knowledgeable guide who can share real-time information about your location as you go.
What other types of conversations could you have with avatars from the Metaverse?
“These can range from a casual conversation partner to driving guidance, language study, business and personal consulting and counseling, all done in the same shared space as the user,” explained Tetsuro Ueda, a technology expert at the Nissan Research Center.
The avatar will not be able to take control of the vehicle. In autonomous driving situations, however, Nissan envisions a special operator being able to relay instructions to an automated-driving vehicle, along the lines of the SAM (Seamless Autonomous Mobility) concept Nissan announced at CES 2017. Even then, Nissan is not considering transferring control of the actual driving operations.
If you’re wondering about the safety of a driver joining the Metaverse while driving, rest assured that it won’t be possible to fully dive into the virtual world from behind the wheel. The technology uses elements of the virtual world to let users hold conversations with avatars. It differs technologically from a full VR experience; while full immersion is technically possible, Nissan doesn’t believe that type of use would be feasible while driving.
In a manual driving experience, I2V collects data from the Omni-Sensing technology and overlays it for the driver. This can help in driving situations that involve poor visibility, abnormal road surface conditions or oncoming traffic. It also reduces driver stress by pointing out a vacant parking space or suggesting alternate routes during traffic. The information can even be specific enough to recommend the best lane to be in during heavy traffic.
Nissan’s Invisible-to-Visible technology was on display at CES 2019 in Las Vegas, Nevada last week, offering visitors an interactive 3D immersion experience through augmented reality goggles. The demonstration simulated a guided tour through various scenarios, including a city tour, help finding a parking place and a rainy day turning into a sunny one. There was even a professional driver avatar that hopped into the passenger seat to help drivers improve their skills.
Part of Nissan’s goal is to present information and data in a human-like way, and the company believes that by the late 2030s the Metaverse will be the next generation of human communication. We still have a way to go before this vision becomes a reality, due to current limitations with network speeds and AR. While the technology will be available anywhere with internet access, it will depend on the introduction of 5G networks. Wireless speed paired with the evolution of AR technology are the key factors in making I2V a reality and opening up endless possibilities for communication.