It's worth remembering that Lidar also has weaknesses of its own (not just its price), just as cameras of course do. At this stage I wouldn't necessarily bet that in the future every sensible player will be using Lidar.
In Tesla's case, Arbe Robotics' high-resolution Phoenix radar has been brought up: it is far cheaper than Lidar, and unlike a light-based sensor its operation is not degraded when weather conditions turn challenging. I would also keep an eye on these high-resolution "4D radars".
Tesla is adding a new '4D' radar with twice the range for self-driving | Electrek
http://www.arberobotics.com/wp-content/uploads/2018/06/Wardsauto-arbe-story.pdf
”Arbe’s technology is an essential component in achieving a fully autonomous vehicle that drives in any environment and weather condition. Can you tell us more about the technology behind Arbe?
Currently, most autonomous vehicle sensing suites include two or three types of sensors: camera, radar and in some cases Lidar. The reason several technologies are being used together is that each has strengths and each has weaknesses. You cannot rely on any of them independently. For example, while cameras deliver 2D resolution and Lidar 3D resolution, both lose functionality in common environmental conditions such as darkness, pollution, snow, rain, or fog.
Radar, which is based on radio waves, maintains functionality across all weather and lighting conditions. However, the technology has been limited by low resolution, a disadvantage that has made radar very susceptible to false alarms and inept at identifying stationary objects. Until now, that is.
What we’ve been able to do at Arbe is remove radar’s resolution limitation, infusing this super dependable technology with ultra high-resolution functionalities to sense the environment in four dimensions: distance, height, depth and speed. In the autonomous driving industry, this technological advancement effectively repositions radar from the role of a supportive sensor, to the backbone of the sensor suite.”
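To make the quoted point about stationary objects concrete: with conventional low-resolution Doppler radar, a stationary obstacle "closes in" at exactly the ego vehicle's own speed, which is why such returns were traditionally filtered out as clutter. The sketch below is purely illustrative (the field names, units and the ego-motion compensation are my assumptions, not Arbe's actual data format or API); it shows how a single 4D detection (distance, angles for height/depth, Doppler speed) could be classified as stationary or moving:

```python
from dataclasses import dataclass
import math

@dataclass
class RadarDetection:
    """One point from a hypothetical 4D imaging radar (illustrative only)."""
    range_m: float        # distance to target (m)
    azimuth_rad: float    # horizontal angle (rad), 0 = straight ahead
    elevation_rad: float  # vertical angle (rad), gives height
    doppler_mps: float    # measured radial speed relative to the radar (m/s)

def is_stationary(det: RadarDetection, ego_speed_mps: float,
                  tol_mps: float = 0.5) -> bool:
    """A stationary object's Doppler should roughly equal the (negated)
    projection of the ego vehicle's speed along the line of sight."""
    expected = -ego_speed_mps * math.cos(det.azimuth_rad) * math.cos(det.elevation_rad)
    return abs(det.doppler_mps - expected) <= tol_mps

# Driving at 20 m/s: an object straight ahead closing at 20 m/s is a
# stationary obstacle; one closing at 25 m/s is itself moving toward us.
wall = RadarDetection(range_m=80.0, azimuth_rad=0.0, elevation_rad=0.0, doppler_mps=-20.0)
car = RadarDetection(range_m=80.0, azimuth_rad=0.0, elevation_rad=0.0, doppler_mps=-25.0)
print(is_stationary(wall, ego_speed_mps=20.0))  # True
print(is_stationary(car, ego_speed_mps=20.0))   # False
```

The point of the added angular resolution is that this per-point classification becomes reliable enough to keep stationary returns instead of discarding them as clutter.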
The primary issue is that autonomous cars still can’t see well enough to safely maneuver in heavy traffic or see far enough ahead to handle highway conditions in any kind of weather.
Video cameras can be foiled by glare. Standard radar can judge the relative speed of objects but has Mr. Magoo-like vision. Ultrasonic sensors can sense only nearby objects — and not very clearly. Lidar (formally, light detection and ranging), while able to create 3-D images of people and street signs, has distance limitations and can be stymied in heavy rain. And even the most sophisticated artificial intelligence software can’t help if it doesn’t have the perceptual data to begin with.
Radar companies are also working on improvements. Arbe Robotics is developing high-resolution so-called 4-D imaging radar that can create detailed images at distances of over 900 feet. Arbe’s chief executive, Kobi Marenko, said the company was doing field tests now.
“Current sensors that are being used are not good enough yet,” Mr. Marenko acknowledged, referring to conventional Doppler radar. But he said more advanced radar and camera systems could solve the problem without the need for more expensive or offbeat technology.
Xilinx and Continental are also collaborating on a similar type of radar.