ARHUD driving navigation technology overview


ARHUD (Augmented Reality Head-Up Display) combines augmented reality with the head-up display: it projects rendered elements onto the real world. For the driver, it is currently the display method with the lowest cost of comprehension.

The HUD was first applied during World War II, on gunsights and fighter aircraft. In the early 1980s it began to see civilian use, and in the early 1990s the technical concept was formally proposed and evolved into an automotive feature. In fact, many features on cars originated as military technology, such as inertial navigation devices.

ARHUD driving navigation projects key navigation information such as speed limits, turn maneuvers, and guidance lines directly into the driver's field of vision, so the driver can see the guidance without lowering or turning their head.

AutoNavi has done extensive research and development on ARHUD driving navigation and has industry-leading technical reserves and practical experience. In August 2022, AutoNavi Maps cooperated with BAIC and Huawei to launch ARHUD navigation on the BAIC Rubik's Cube.

1. Virtual Image Distance


Virtual Image Distance (VID) is, simply put, the apparent distance from the virtual image to the human eye. The human eye has a focal distance of its own, and it differs when looking near and far; so if the VID is not far enough, the ARHUD display will appear blurred when the driver focuses into the distance.

The VID of a traditional HUD is about 2.5 meters, while the VID of an ARHUD is usually more than 10 meters. To render content across lanes, the projection distance needs to reach about 20 meters.

A traditional W-HUD can be understood as a projector that reflects an image onto the windshield (similar to the HUD projection feature of the Amap mobile app). In effect, it projects the information normally shown on the dashboard onto the windshield, which is also the original intention of HUD design: the driver does not need to look down to obtain information about the vehicle's driving.

However, the image size of a W-HUD is limited (typically a projection distance of 3 m and a display size of 15-20 inches), it can present only a small amount of information, and the image does not merge with the road. The driver still has to move their eyes off the road and refocus to read the information, which actually defeats the original purpose of the HUD.

2. Field of View


Field of View (FOV) comprises the horizontal and vertical viewing angles centered on the driver's eyes. The FOV of a traditional HUD is very small, generally only about 5 degrees, while the horizontal FOV of an ARHUD needs to be above 10°. The ARHUD of the Ideal ONE reaches 20°, and that of the Wenjie M5 reaches 13°.
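
To get an intuition for how the FOV and the virtual image distance together determine the apparent size of the virtual image, here is a minimal sketch in Python; the geometry is simple pinhole optics and the parameter values are only the example figures quoted above, not vendor specifications:

```python
import math

def virtual_image_width(vid_m: float, hfov_deg: float) -> float:
    """Apparent width (meters) of the virtual image at distance vid_m,
    given a horizontal field of view of hfov_deg, under simple pinhole geometry."""
    return 2.0 * vid_m * math.tan(math.radians(hfov_deg) / 2.0)

# Traditional W-HUD: VID ~2.5 m, FOV ~5 degrees
print(f"W-HUD : {virtual_image_width(2.5, 5.0):.2f} m wide")    # ~0.22 m
# ARHUD: VID >= 10 m, horizontal FOV >= 10 degrees
print(f"ARHUD : {virtual_image_width(10.0, 10.0):.2f} m wide")  # ~1.75 m
```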

3. Eye Point

The eye point is the coordinate (x, y, z) of the human eye in the vehicle body coordinate system, with the center of the front of the vehicle as the origin, measured in meters.

The human eye coordinates will dynamically adjust according to the driver's height, sitting posture, and head position.
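
A minimal sketch of how the eye point might be represented and updated; the EyePoint type and the smoothing step are illustrative assumptions, not AutoNavi's actual interface, and only the coordinate convention (origin at the center of the front of the vehicle, units in meters) comes from the description above:

```python
from dataclasses import dataclass

@dataclass
class EyePoint:
    """Driver's eye position in the vehicle body frame.

    Origin: center of the front of the vehicle; units: meters.
    """
    x: float
    y: float
    z: float

def smooth_eye_point(current: EyePoint, measured: EyePoint, alpha: float = 0.2) -> EyePoint:
    """Blend a new driver-monitoring measurement into the current estimate,
    so the projected image does not jump when the head moves slightly."""
    return EyePoint(
        x=current.x + alpha * (measured.x - current.x),
        y=current.y + alpha * (measured.y - current.y),
        z=current.z + alpha * (measured.z - current.z),
    )
```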


4. Virtual image rotation angles (three degrees of freedom)


4.1. Rotation angle about the X-axis (LDA, pitch angle)


4.2. Rotation angle about the Y-axis (roll angle)


4.3. Rotation angle about the Z-axis (heading angle)

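The three angles can be composed into a single rotation matrix describing the orientation of the virtual image plane. Here is a minimal sketch using standard right-handed rotation matrices; the Z·Y·X composition order is an assumption, since the actual convention depends on the HUD vendor:

```python
import numpy as np

def rot_x(pitch_rad: float) -> np.ndarray:
    """Rotation about the X-axis (pitch / look-down angle)."""
    c, s = np.cos(pitch_rad), np.sin(pitch_rad)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s,  c]])

def rot_y(roll_rad: float) -> np.ndarray:
    """Rotation about the Y-axis (roll angle)."""
    c, s = np.cos(roll_rad), np.sin(roll_rad)
    return np.array([[ c, 0, s],
                     [ 0, 1, 0],
                     [-s, 0, c]])

def rot_z(yaw_rad: float) -> np.ndarray:
    """Rotation about the Z-axis (heading angle)."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    return np.array([[c, -s, 0],
                     [s,  c, 0],
                     [0,  0, 1]])

def virtual_image_rotation(pitch_rad: float, roll_rad: float, yaw_rad: float) -> np.ndarray:
    """Compose the three degrees of freedom into one 3x3 rotation matrix.
    The Z-Y-X composition order here is an assumption."""
    return rot_z(yaw_rad) @ rot_y(roll_rad) @ rot_x(pitch_rad)
```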

5. Virtual image coordinate conversion (world coordinates to virtual image coordinates)

First, let's take a look at how world coordinates are converted to pixel coordinates in camera projection.

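For reference, here is a minimal sketch of the textbook pinhole camera projection, with intrinsics (fx, fy, cx, cy) and extrinsics (R, t); this is generic computer vision, not AutoNavi's specific pipeline:

```python
import numpy as np

def project_camera(point_world: np.ndarray,
                   R: np.ndarray, t: np.ndarray,
                   fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Project a 3D world point to pixel coordinates with a pinhole camera.

    world -> camera frame:  p_cam = R @ p_world + t
    camera frame -> pixels: u = fx * x / z + cx,  v = fy * y / z + cy
    """
    p_cam = R @ point_world + t
    x, y, z = p_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    return np.array([fx * x / z + cx, fy * y / z + cy])
```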

Next, let's look at how world coordinates are converted to virtual image coordinates (also in pixels) in HUD projection.

Given the virtual image distance, field of view, eye position, and virtual image rotation angles, world coordinates and virtual image coordinates can be converted into each other.


By comparing camera projection and HUD projection, we can find that the focal length in camera projection is closely related to the virtual image distance in HUD projection.
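
Putting the parameters above together, here is a minimal sketch of a world-to-virtual-image projection. The driver's eye plays the role of the camera center, and the virtual image plane sits at the VID in front of the eye. The frame conventions (image x to the right, y up, z along the viewing direction; top-left pixel origin) are simplifying assumptions, not a vendor specification:

```python
import numpy as np

def hud_project(point_world: np.ndarray,
                eye_pos: np.ndarray,      # eye point in vehicle coordinates (m), see Section 3
                R_image: np.ndarray,      # 3x3 rotation of the image frame, see Section 4
                vid_m: float,             # virtual image distance (m)
                hfov_deg: float, vfov_deg: float,
                width_px: int, height_px: int) -> np.ndarray:
    """Map a vehicle-body world point to virtual image pixel coordinates.

    R_image maps the image frame (x right, y up, z toward the virtual image)
    into the vehicle frame; its transpose brings world points into the image frame.
    """
    # World point expressed in the eye-centered image frame.
    p = R_image.T @ (point_world - eye_pos)
    x, y, z = p
    if z <= 0:
        raise ValueError("point is behind the driver")

    # Physical size of the virtual image plane at depth vid_m (see Section 2).
    plane_w = 2 * vid_m * np.tan(np.radians(hfov_deg) / 2)
    plane_h = 2 * vid_m * np.tan(np.radians(vfov_deg) / 2)

    # Intersect the eye->point ray with the plane at depth vid_m
    # (this is where the VID plays the role the focal length plays in a camera).
    x_plane = vid_m * x / z
    y_plane = vid_m * y / z

    # Meters on the plane -> pixels, with a top-left pixel origin.
    u = (x_plane / plane_w + 0.5) * width_px
    v = (0.5 - y_plane / plane_h) * height_px
    return np.array([u, v])
```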

The human eye has its own focal distance, which differs between near and far vision. Therefore, if the virtual image distance is not far enough, the ARHUD display will appear blurred when the driver looks into the distance.

The virtual image distance is therefore tied to the focal distance of the human eye.

If the virtual image distance is too small, the driver still has to look away from the road and refocus to read the HUD clearly, which defeats the original design intention of the HUD.

6. Applications of coordinate transformation


6.1. Verify whether the virtual image projection is accurate

Problem faced: The whole point of virtual image projection is to map real-world coordinates into the virtual image. If an accurate correspondence cannot be achieved, the accuracy of the ARHUD suffers.

Solution: The hardware system supplies the projection parameters (virtual image distance, field of view, eye position, virtual image resolution, and virtual image rotation angles), from which a projection matrix is computed. With this matrix, virtual image coordinates and vehicle body world coordinates can be converted into each other.

By taking several representative pixel coordinates on the virtual image (usually nine points) and converting them into vehicle body world coordinates, the visible range of the virtual image can be calculated: the farthest, nearest, leftmost, rightmost, and center visible points.


Markers are then placed in front of the car at the calculated positions, and we check whether the markers seen through the virtual image coincide with the nine points. If they coincide, the projection is accurate; if not, the projection error is large and the hardware system must be notified to make adjustments.
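
A minimal sketch of this check: unproject nine representative virtual-image pixels into eye rays and intersect them with the ground plane to obtain the visible range in vehicle coordinates. The flat-ground assumption, the z-up vehicle axis, and the frame conventions are simplifications consistent with the projection sketch in Section 5:

```python
import numpy as np

def pixel_to_ground(u: float, v: float,
                    eye_pos: np.ndarray, R_image: np.ndarray,
                    vid_m: float, hfov_deg: float, vfov_deg: float,
                    width_px: int, height_px: int,
                    ground_z: float = 0.0) -> np.ndarray:
    """Inverse of the HUD projection: a virtual-image pixel -> a point on the ground.

    Assumes a flat ground plane z = ground_z in the vehicle frame (z up).
    """
    # Pixel -> meters on the virtual image plane (top-left pixel origin, y up).
    plane_w = 2 * vid_m * np.tan(np.radians(hfov_deg) / 2)
    plane_h = 2 * vid_m * np.tan(np.radians(vfov_deg) / 2)
    x_plane = (u / width_px - 0.5) * plane_w
    y_plane = (0.5 - v / height_px) * plane_h

    # Ray from the eye through that point, rotated into the vehicle frame.
    ray = R_image @ np.array([x_plane, y_plane, vid_m])
    if ray[2] >= 0:
        raise ValueError("pixel is at or above the horizon, no ground intersection")
    s = (ground_z - eye_pos[2]) / ray[2]
    return eye_pos + s * ray

def nine_check_pixels(width_px: int, height_px: int):
    """The nine representative pixels: corners, edge midpoints, and the center."""
    us = [0.0, width_px / 2, width_px - 1.0]
    vs = [0.0, height_px / 2, height_px - 1.0]
    return [(u, v) for v in vs for u in us]
```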

6.2. Preventing the lane change guide line from exceeding the virtual image display area

Problem faced: The lane change guide lines in AR navigation are fitted to the real world and point toward the adjacent target lane. If the visible range of the virtual image cannot cover the adjacent lane, the lane change line would extend beyond the display area.

Solution: Based on the lane change information (left or right, and how many lanes to cross), take several trend points in pixel coordinates on the virtual image, convert them into vehicle body world coordinates, and finally project them for rendering. Because the points are chosen on the virtual image itself, they can never exceed the virtual image display area.
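
A minimal sketch of this idea: sample the trend points directly in virtual image pixel space based on the lane-change direction, clamp them to the image resolution, and then convert them to vehicle coordinates (for example with the inverse projection from 6.1) for rendering. The specific sweep pattern below is only an illustrative assumption:

```python
def lane_change_trend_pixels(direction: str, lanes: int,
                             width_px: int, height_px: int,
                             n_points: int = 5):
    """Sample pixel coordinates for a lane-change guide line on the virtual image.

    direction: "left" or "right"; lanes: how many lanes to cross.
    The points start near the image center, sweep toward the target side,
    and are clamped so they never leave the virtual image display area.
    """
    sign = -1.0 if direction == "left" else 1.0
    # Sweep farther for more lanes, but never past 45% of the image width from the center.
    max_offset = min(0.45, 0.15 * lanes) * width_px * sign
    points = []
    for i in range(n_points):
        t = i / (n_points - 1)                 # 0 .. 1 along the guide line
        u = width_px / 2 + t * max_offset      # drift toward the target side
        v = height_px * (0.7 - 0.3 * t)        # from near (low in the image) toward the horizon
        points.append((min(max(u, 0.0), width_px - 1.0),
                       min(max(v, 0.0), height_px - 1.0)))
    return points

# Example: changing two lanes to the left on a 1920x720 virtual image.
# Each pixel can then be mapped to vehicle coordinates with the inverse
# projection sketched in 6.1 and rendered as the guide line.
pixels = lane_change_trend_pixels("left", 2, 1920, 720)
```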


7. ARHUD hardware technology

7.1. TFT

TFT here means TFT-LCD. Its principle is that light emitted by an LED backlight passes through the liquid crystal cells and projects the image.

Advantages: This is the earliest projection solution developed in the industry; it is mature and relatively low-cost. (Currently, foreign suppliers can deliver it for roughly 2,500-3,000 yuan and local suppliers for about 2,000 yuan. As the technology and its supply chain mature, the cost should drop below 2,000 yuan.)

Disadvantages: The problem of sunlight back-reflection is hard to solve, and the brightness is insufficient, so the display quality is poor in daytime.



7.2. DLP

DLP is the abbreviation of Digital Light Processing. It uses TI's DMD chip to digitally process the image signal and then project it.

Advantages: The DMD chip produces moving images that are colorful, fine-grained, and natural. Thanks to the digital processing, defects in the image can be corrected. The DMD chip is also small and easy to integrate.

Disadvantages: The cost is higher (more than 5,000 yuan).

DLP may also produce a rainbow effect, in which the color mixing and conversion of the image signal becomes abnormal during digital processing.

Since DLP displays depend on TI's DMD chips and the associated patents, they are currently used only in Mercedes-Benz and Trumpchi models.


7.3. LCOS

LCOS is the abbreviation of Liquid Crystal on Silicon, also called silicon-based liquid crystal. It is a relatively compact, reflective matrix liquid crystal display device, with the matrix fabricated on a silicon chip using a CMOS process. In China, Huawei and Yishu Technology mainly adopt this solution at present.

Advantages: The fully reflective mode gives high light utilization efficiency and a more natural picture. The price is controllable, and the CMOS process is mastered by multiple manufacturers, avoiding the situation where the DMD chip is monopolized by Texas Instruments. A metal light-shielding layer between the reflective layer and the silicon substrate circuitry effectively prevents sunlight back-reflection.

Disadvantages: The technology is not yet fully mature, has not reached large-scale mass production, and needs further development. The visible area of the HUD is small, and the projection unit is relatively large.


8. Main technical difficulties of ARHUD

  • Small field of view

The FOV of ARHUD devices currently on the market is too small, and the image can only be displayed in a small part of the driver's line of sight.

  • Projection brightness

To cope with varying external light, weather, and other conditions, the HUD image needs higher brightness to achieve good image quality and visual effect.

  • Hardware volume

The overall volume of the HUD system needs to be reduced, but the limitations of existing TFT/DLP modules and the demand for a large FOV make the HUD system ever larger, which conflicts with the space budget of the vehicle body.

  • Real scene fit

The display needs to be corrected in real time using road network data, sensor data, GPS signals, and so on, to ensure that the AR graphics match the real road conditions.

  • Human eye position

Dynamically tracking the position of the human eye and adjusting the image projected by the ARHUD accordingly, to avoid problems such as blurring and misalignment, will test the capabilities of HUD manufacturers.

Conclusion

ARHUD technology has become a battleground in the driving navigation industry, and it can be expected that Apple's ARHUD will gradually move toward driving navigation as well. Of course, many technical difficulties still need to be overcome to improve the user experience and truly achieve "what you see is what you get" within the navigation field of view. What is gratifying is that, amid the rapid development of ARHUD technology, we have seen the efforts of many domestic companies. We hope more Chinese technologies will shine in the ARHUD field in the future!

Note: Some of the illustrations in this article come from the Internet. If there is any infringement, please contact us for removal.

