Peering through the pea-souper: New sensor technologies will help autonomous vehicles to see clearly

  • Several new image sensors in development
  • Opportunities not to be mist
  • They will augment rather than replace established LIDAR
  • Market growth potential is no mirage

“I can see clearly now the rain has gone”, warbled Johnny Nash back in 1972. Fifty years on and it might soon be “a bright, bright, sunshiny day” for autonomous vehicles (AVs). IDTechEx has published a new report, “Emerging Image Sensor Technologies 2021-2031: Applications and Markets”, that provides considerable insight into innovative image sensing technologies, including short-wavelength infrared (SWIR) sensors covering wavelengths of 1,000 to 2,500 nanometres, event-based sensors and other emerging sensor types that will allow AVs to ‘see’ even in challenging environmental conditions, such as thick fog.

With the continuing (and expensive) development of AVs, the search is on for image sensing solutions that will give them a constant, clear picture of their surroundings, whatever those surroundings may be and wherever the vehicle happens to be. Achieving an absolutely crystal-clear picture will require not just multiple sensors but multiple types of sensor, on and in a vehicle, able to provide information instantly and with extreme precision, robustness and redundancy. It’s a significant challenge, but R&D continues apace.

In recent years most AV sensor research has been focused on light detection and ranging (LIDAR), a laser technology used in 3D applications to capture the range (variable distance) of an object or a space. LIDAR is widely applied in AVs, robotics, space exploration, surveying, construction and engineering. With AVs, LIDAR is used to enable obstacle detection and avoidance. 
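By way of a rough, illustrative sketch (not drawn from the IDTechEx report), the ranging principle itself is simple: a laser pulse is fired, the reflection is timed, and half the round-trip distance at the speed of light gives the range to the target.

```python
# Toy illustration of the LIDAR ranging principle (a simplified sketch,
# not taken from the IDTechEx report): the range to a target is half the
# distance light travels during the pulse's round trip.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_round_trip(time_of_flight_s: float) -> float:
    """Return the distance to a target given the pulse's round-trip time in seconds."""
    return SPEED_OF_LIGHT * time_of_flight_s / 2.0

# A reflection detected 200 nanoseconds after emission implies a target roughly 30m away.
print(f"{range_from_round_trip(200e-9):.1f} m")  # -> 30.0 m
```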

Current LIDAR systems on AVs exploit rotating hexagonal mirrors that split the laser beam. The three upper beams are used to sense other vehicles and obstacles ahead, while the lower beams are used to detect lane markings and road features. One of the major advantages of LIDAR is that it can be amalgamated with other sensors, such as radar, to provide a much-improved view of the environment (static or in motion) through which an AV is passing. However, a big problem with LIDAR is that dust, rain, road spray, snow and fog cause laser emissions to bounce back and degrade the ‘view’, hence the quest to augment LIDAR with new data from different sensors.

This is where SWIR comes in. While indium gallium arsenide (InGaAs) sensors are effective adjuncts to LIDAR in high-speed and high-sensitivity situations, they are very expensive. That’s why scientists are searching for an alternative and less-costly technology that can detect light towards the lower end of the SWIR spectrum and thus provide better vision in adverse conditions.

Conventional silicon photodetectors cannot discern light with wavelengths above 1,000nm, so researchers are pursuing an extended-silicon approach: increasing the thickness of silicon photodetectors and etching a different structure onto their surfaces to push their sensitivity beyond 1,000nm. Because the approach uses existing chip-fabrication techniques, it is hoped, and indeed expected, that the new, cheaper devices will be ideally suited to AVs and advanced driver assistance systems (ADAS).
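For readers who want a back-of-the-envelope check, the limit comes from silicon’s bandgap: a photon must carry at least the bandgap energy (roughly 1.12 eV for silicon, a nominal assumed value) to excite an electron, which puts a hard cutoff near 1,100 nanometres, with practical sensitivity falling off well before that point.

```python
# Back-of-the-envelope check (assumes a nominal silicon bandgap of 1.12 eV):
# a photon can only excite an electron if its energy exceeds the bandgap,
# so the cutoff wavelength is h*c / E_gap. Practical sensitivity tails off
# well before this hard limit, hence the ~1,000nm figure quoted above.
HC_EV_NM = 1239.84          # Planck's constant times the speed of light, in eV·nm
SILICON_BANDGAP_EV = 1.12   # assumed nominal bandgap for silicon

cutoff_nm = HC_EV_NM / SILICON_BANDGAP_EV
print(f"Silicon cutoff wavelength ≈ {cutoff_nm:.0f} nm")  # ≈ 1107 nm
```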

The IDTechEx report also covers event-based vision, another technology that is expected to be widely used in AVs. It is a dynamic method of acquiring visual information that records only the changes in a given scene rather than recording the entire scene at regular intervals. An array of independent pixels asynchronously detects and acquires changes in the scene to deliver faster decisions with less data, which is exactly what AVs need to help keep them safe.

Event-based sensors also work better than conventional visible-light sensors when visibility is less than optimal. As the IDTechEx report notes, “By just detecting changes in the scene, pedestrians or other moving vehicles can be clearly identified since a complex background is not detected. Furthermore, since each pixel is triggered by the relative intensity change, event-based vision offers a significantly higher dynamic range that improves image detection at low light levels.”
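A minimal sketch helps make the idea concrete. The model below is the generic, simplified description of an event pixel, in which a change in log intensity crossing a contrast threshold fires an event; it illustrates the principle rather than any particular vendor’s design, and the threshold value is an assumption.

```python
import math

# Minimal sketch of an event-based pixel (the generic log-intensity model,
# not any particular vendor's design). Each pixel fires an event only when
# its log intensity has moved by more than a contrast threshold since the
# last event it fired; a static scene therefore produces no data at all.
CONTRAST_THRESHOLD = 0.15  # assumed relative change needed to trigger an event

def pixel_events(intensity_samples):
    """Yield +1/-1 events as the pixel's log intensity crosses successive thresholds."""
    reference = math.log(intensity_samples[0])
    for sample in intensity_samples[1:]:
        delta = math.log(sample) - reference
        while abs(delta) >= CONTRAST_THRESHOLD:
            polarity = 1 if delta > 0 else -1
            yield polarity
            reference += polarity * CONTRAST_THRESHOLD  # move the reference, keep the remainder
            delta = math.log(sample) - reference

# A static background is silent; a pedestrian that doubles the pixel's brightness
# triggers a short burst of positive events. Because the trigger is a relative
# (logarithmic) change, the behaviour is the same in bright sun or near-darkness,
# which is where the high dynamic range quoted above comes from.
print(list(pixel_events([100, 100, 100, 200, 200])))  # -> [1, 1, 1, 1]
```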

The report concluded that SWIR and event-based vision sensors will be additions to the range of AV sensing technologies and will not replace the LIDAR, radar and cameras already established for ADAS. “Integrating information from multiple sensor types covers the weaknesses of individual approaches, provides redundancy, and ultimately will enable safer and increasingly autonomous vehicles,” it added.
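To see why fusing sensor types buys redundancy, consider a deliberately simple, hypothetical example (the sensors and confidence figures below are invented for illustration, not drawn from the report): if fog blinds the visible camera and scatters the LIDAR returns, a SWIR detection can still carry the decision.

```python
# Toy illustration of sensor-fusion redundancy (the sensors and confidence
# values are invented for this example, not taken from the report). Treating
# the sensors as independent, the chance that all of them miss an object is
# the product of their individual miss probabilities.
detections = {
    "visible camera": 0.20,  # degraded by thick fog
    "lidar": 0.35,           # laser returns scattered by water droplets
    "swir camera": 0.90,     # sees through the fog
}

def fused_confidence(confidences):
    """Combine independent detection confidences into one overall confidence."""
    miss_probability = 1.0
    for confidence in confidences.values():
        miss_probability *= 1.0 - confidence
    return 1.0 - miss_probability

print(f"Fused confidence: {fused_confidence(detections):.2f}")  # ≈ 0.95
```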

Indeed, according to the MarketWatch website, a subsidiary of Dow Jones, the global SWIR market will grow quickly, at a compound annual growth rate (CAGR) of 8.1% over the next five years, to be worth US$2.11bn by 2028.
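Working the forecast backwards, and assuming the 8.1% rate compounds over the full five years to 2028 (a base year MarketWatch does not state), that figure implies a starting market of roughly US$1.4bn.

```python
# Quick arithmetic check of the quoted forecast. We assume the 8.1% CAGR
# compounds over the five years to 2028; the base year is our assumption,
# not stated by MarketWatch.
cagr = 0.081
final_value_bn = 2.11
years = 5

implied_start_bn = final_value_bn / (1 + cagr) ** years
print(f"Implied starting market size ≈ US${implied_start_bn:.2f}bn")  # ≈ US$1.43bn
```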

IDTechEx, headquartered in Cambridge, England, is a long-established, independent market research and business intelligence company with subsidiaries in Japan, Germany and the US.
