Can AR HUD replace the central control panel?
Ten years ago, when Elon Musk first unveiled the central control screen of the Tesla Model S, everyone present knew a new era was coming: the iPhone had already proved the power of "intelligence" by replacing the physical keyboard with a screen. Tesla's central control screen not only replaced the car's physical buttons in form, it overturned the interactive experience of the car cockpit. Since then, the "big screen" has become standard equipment on smart cars.

Even today, ten years on, carmakers remain persistent in their pursuit of the screen. Huawei's latest AITO M5 carries a 10.4-inch curved full-LCD instrument panel and a 15.6-inch semi-floating central control screen. The HiPhi X pairs a 14.6-inch instrument screen and a 16.9-inch central control screen with a 19.9-inch passenger screen. The latest Li ONE offers "four-screen linkage." Nor is it only the new EV makers: traditional carmakers have also pushed hard on big screens. BMW's latest iDrive 8 uses the largest, sharpest curved floating screen in BMW's history, and Mercedes-Benz fitted its flagship electric EQS with a screen 1.4 m wide, in a car that is itself only 1.9 m wide.

On the other hand, Tesla, the pioneer of the "big screen" era, has turned conservative about in-car screens, fitting its newly released Plaid version with only a 17-inch screen that can swivel left and right. This makes one wonder: has Tesla dropped out of the screen race, or does Elon Musk believe technology can make another breakthrough?

During the 2021 Shanghai Auto Show, Huawei not only demonstrated impressive autonomous driving technology but also released its latest HUD technology, capable of projecting a 70-inch high-definition image at a distance of 7.5 meters, offering new hope to carmakers locked in the big-screen race.
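
A quick sanity check on what such a claim means: a virtual image's "inch" size only has meaning together with its projection distance, since what the driver actually perceives is an angular size. The sketch below (the helper name is mine; the 70-inch/7.5 m figures come from Huawei's claim above) converts a diagonal-plus-distance spec into a field-of-view angle.

```python
import math

def hud_fov_deg(diagonal_inch: float, distance_m: float) -> float:
    """Angular size (in degrees) subtended by a virtual image with the
    given diagonal, projected at the given virtual-image distance."""
    diagonal_m = diagonal_inch * 0.0254           # inches -> meters
    return math.degrees(2 * math.atan(diagonal_m / (2 * distance_m)))

# Huawei's claim: a 70-inch image at 7.5 m
print(round(hud_fov_deg(70, 7.5), 1))   # -> 13.5 (degrees, diagonal)
```

The same arithmetic applies to other specs quoted later in this article, e.g. a 550-inch screen at 50 m works out to a similar order of angular size.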

HUD (Head-Up Display), also known as a head-up display system, evolved from fighter-jet optical gunsight technology. It projects vehicle information onto the front windshield so that drivers can keep their attention on the road ahead without looking down at the instruments, improving driving safety. In its earliest form, an automotive HUD could display only basic information such as speed and warning icons; as the technology developed, it evolved toward a larger display area and clearer images, to the point where it can fully replace the instrument cluster.

As autonomous driving frees the attention of drivers and passengers from the road, it places higher demands on the HUD's entertainment functions and pushes HUD development toward larger and clearer displays. The Huawei HUD mentioned above follows this route.

Another requirement autonomous driving places on the HUD is fusion. An autonomous driving system perceives the surrounding environment and then plans its route within a constructed virtual space. Andrej Karpathy, then director of Tesla's AI division, proposed at Tesla AI Day 2021 that the virtual space built by the autonomous driving system needs "spatial fusion" (the positions of all objects on the road must be correct) and "temporal fusion" (there can be no delay).

The virtual space built by the autonomous driving system is usually shown on the car's big screen for the driver to supervise. But once the driver's eyes leave the road, that supervision loses its meaning, so fusing the virtual information with the real world on the HUD becomes the best solution. Thus the AR-HUD was born.

AR-HUD began to attract wide attention when the new Mercedes-Benz S-Class launched in early 2021, and its floating navigation arrow became a must-show demo for major carmakers. At the just-concluded CES 2022, AR-HUD likewise became a new arena in which major manufacturers competed to show their strength.

Panasonic released AR-HUD 2.0 with eye-tracking technology, which adjusts the image according to changes in the driver's gaze point so that, however the driver moves head and line of sight, the AR navigation stays clear and accurate. Qualcomm's Snapdragon automotive cockpit platform will incorporate Phiar's computer vision and spatial-AI technology to deliver an in-vehicle AR HUD system that can perceive the environment and navigate. CY Vision unveiled a next-generation AR-HUD featuring 3D display, dynamic-focus display, eye tracking, AR imagery that mimics human vision, a large field of view, and high brightness, suitable for various distances and weather conditions, and announced a partnership with BMW to develop the latest AR technology. EyeLights and AGC jointly brought an industrialized AR function into series-production vehicles, claiming they can project a virtual screen of up to 550 inches at a projection distance of 50 meters.

Although AR-HUD showed a clear push toward volume production at CES 2022, it is not easy to realize technically. You may often have seen forced-perspective "magic" videos in which the overlay appears perfectly fused with the road, but seen in person the illusion falls apart easily: compared with a one-eyed camera, the human eye is far more sensitive to the depth and distance of objects in space. When the eye focuses on a nearby object, it cannot see distant objects clearly; and when it focuses on a distant object, nearby objects blur and interfere with the line of sight.

At present, most HUD technologies project only a two-dimensional image at a fixed distance in front of the car. Using animation effects to simulate fusion on that fixed plane is as awkward as the forced-perspective magic above. Take the AR-HUD demonstration video released for the Mercedes-Benz S-Class: although the virtual arrow moves with the rotation of the steering wheel, when the eyes fix on the arrow the distant road blurs, and when the eyes focus on the distant road, the nearby arrow dissolves into a smear of blue.
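
The focus conflict described here can be quantified. Optometry measures focus in diopters (1/distance in meters), and the eye tolerates only a small accommodation mismatch before one of the two planes blurs. The sketch below is illustrative (the 2.5 m focal plane, the ±0.3 D comfort threshold, and the function name are my assumptions; typical automotive HUDs project roughly 2-3 m ahead).

```python
def accommodation_gap_d(hud_distance_m: float, object_distance_m: float) -> float:
    """Focus mismatch, in diopters, between a HUD virtual-image plane
    and a real object the driver is looking at."""
    return abs(1.0 / hud_distance_m - 1.0 / object_distance_m)

COMFORT_THRESHOLD_D = 0.3   # rough depth-of-focus of the eye (assumption)

# Conventional HUD plane at 2.5 m vs. a car 50 m down the road:
gap = accommodation_gap_d(2.5, 50)
print(round(gap, 2))                      # -> 0.38 diopters
print(gap > COMFORT_THRESHOLD_D)          # -> True: one plane must blur
```

This is exactly the arrow-versus-road trade-off in the Mercedes demo: the eye cannot hold both the 2.5 m overlay and the distant road in focus at once.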

Therefore, for an AR-HUD the projection distance of virtual information cannot be fixed; it must match the positions of real objects so as not to interfere with driving. Moreover, the people, cars, and environment on the road are complex, so the space into which an AR-HUD projects information is not a plane but a 3D light field with depth. The information projected into this light field must match the virtual environment perceived by the autonomous driving system, which requires accurate spatial fusion and "zero-delay" temporal fusion. Very few companies in the world possess this display technology. WayRay, a Zurich-based holographic augmented-reality startup, claims its Deep Reality Display technology can project virtual information at different distances, achieving "True AR."

If perfect fusion of the virtual and the real can be achieved, the potential of AR-HUD is not limited to projecting driving information. In 2021, Facebook founder Mark Zuckerberg put forward the "metaverse" concept that shocked the world, and at its core is an immersive interactive experience combining the virtual and the real. WayRay accordingly released the Holograktor concept car, a "metaverse on wheels." On the strength of its "True AR" concept alone, WayRay has secured substantial financing from giants such as Alibaba, Porsche, and Hyundai Motor, which shows the market's expectations for the future of true AR-HUD technology.

Besides WayRay, the domestic startup FUTURUS also possesses 3D light-field AR-HUD technology. Its light-field AR-HUD system can continuously shift the HUD virtual image from 4 meters to infinity, achieving true fusion of the virtual image with the real road at the optical level. FUTURUS has completed three generations of 3D light-field AR HUD in-vehicle hardware, and the system has been refined through three years of real-road testing. Reportedly, the company will release its AR-HUD products at the 2022 Beijing Auto Show.
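
The "4 meters to infinity" range sounds extreme but is modest in optical terms, because focus depth is reciprocal in distance: the whole span from 4 m out to infinity covers only 0.25 diopters of optical travel. The sketch below (function name is mine) maps a target object distance to the virtual-image power a variable-focus HUD would need.

```python
def required_diopter(target_m: float) -> float:
    """Virtual-image power (diopters) needed to place the HUD image at
    the same optical depth as a real object target_m away.
    Infinity corresponds to 0 diopters."""
    return 1.0 / target_m

# A FUTURUS-style 4 m -> infinity range spans just 0.25 D:
print(required_diopter(4))     # -> 0.25  (nearest end of the range)
print(required_diopter(50))    # -> 0.02  (a car 50 m ahead)
```

Put differently, nearly all of the optically "hard" focus range lies closer than 4 m; beyond that, matching any road object requires only tiny adjustments, which is what makes a continuous-zoom light-field design plausible.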

Beyond startups, the technology giant Apple also treats the AR-HUD as one of its secret weapons for disrupting the car. In 2021, the US Patent and Trademark Office granted Apple a Project Titan patent for a light-field head-up display. According to the patent, the HUD in Apple's car "can be a light field display, and the light field output generated by it allows viewers to observe the three-dimensional content on the head-up display. An array of light field display cells and corresponding lenses can be used to guide the light field output to an observer. The lens can guide the overlapping light field output of the display unit to the viewer, thus forming an enlarged seamless light field viewing area on the window."

In 2007, Apple completely transformed the mobile phone, ending the era of the keyboard phone with a single clean screen. In 2012, the Tesla Model S likewise overturned the interactive experience of the car cockpit with a screen. Ten years later, as HUD technology matures, a "screenless" revolution may already have begun. Let us wait and see.