(California Institute of Technology) – Imagine driving home after a long day at work. Suddenly, a car careens out of an obscured side street and turns right in front of you. Luckily, your autonomous car saw this vehicle long before it came within your line of sight and slowed to avoid a crash. This might seem like magic, but a technique developed at Caltech brings the concept closer to reality.
With the advent of autonomous vehicles, advanced spacecraft, and other technologies that rely on sensors for navigation, there is an ever-increasing need for systems that can scan for obstacles, pedestrians, or other objects. But what if something is hidden behind another object?
In a paper recently published in the journal Nature Photonics, Caltech researchers and their colleagues describe a new method that essentially transforms nearby surfaces into lenses that can be used to indirectly image previously obscured objects.
The technology, developed in the laboratory of Changhuei Yang, Thomas G. Myers Professor of Electrical Engineering, Bioengineering, and Medical Engineering and Heritage Medical Research Institute Investigator, is a form of non-line-of-sight (NLOS) sensing – sensing that detects an object of interest outside the viewer's line of sight. The new method, dubbed UNCOVER, does this by using nearby flat surfaces, such as walls, like lenses to clearly view the hidden object.
Most current NLOS imaging technology detects light from a hidden object that is passively reflected by a surface such as a wall. However, because surfaces such as walls predominantly scatter light rather than reflect it cleanly, these techniques do not produce clear images. Computational imaging methods can extract information from the scattered light and improve image clarity, but they cannot generate high-resolution images.
UNCOVER, however, counteracts scattering through its use of wavefront shaping technology. Wavefront shaping had previously not been viable for NLOS imaging because it requires a guidestar: an approximate point source of light that reveals how a surface scatters light and thereby allows details of the hidden object to be deduced.
“We know that lenses image a point onto another point. If you are looking through a bad ‘lens’ with matte surfaces, the image of a point is now blurred, and the light spreads all over the place, but you can grind and polish the matte surface to navigate the light to the correct position,” explains electrical engineering graduate student Ruizhi Cao, the first author of the Nature Photonics paper. “That is how a guidestar helps you in principle: it tells us where the tiny bumps are, so that we know how to correctly polish the surface.”
Yang and his colleagues found that the hidden object itself could be used as the guidestar. The result is an NLOS imaging method that pieces the scattered light back together into a clear image of the hidden object.
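The core idea of wavefront shaping can be illustrated with a toy simulation. The sketch below is not the UNCOVER algorithm itself, but a standard textbook model of the underlying principle: a random complex "transmission matrix" stands in for the scattering wall, and phase-conjugating the row that maps onto a guidestar point refocuses light that would otherwise emerge as blurry speckle. The matrix size and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256  # number of input/output optical modes (illustrative choice)

# Random complex matrix models how a matte wall scrambles light:
# each output mode is a random mixture of all input modes.
T = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(2 * N)

target = 100  # output mode where the "guidestar" point sits

# Unshaped illumination: a flat input wavefront yields random speckle.
flat = np.ones(N) / np.sqrt(N)
speckle = np.abs(T @ flat) ** 2

# Wavefront shaping: cancel the phases of the row that feeds the
# guidestar mode, so all contributions arrive in phase ("polishing"
# the bad lens, in Cao's analogy).
shaped = np.exp(-1j * np.angle(T[target])) / np.sqrt(N)
focused = np.abs(T @ shaped) ** 2

enhancement = focused[target] / speckle.mean()
print(f"intensity enhancement at the guidestar: {enhancement:.0f}x")
```

For phase-only shaping of N modes, theory predicts an enhancement of roughly (pi/4)N over the average speckle background, so this run should report an enhancement on the order of 200x. In UNCOVER, the key twist reported by the researchers is that the hidden object itself plays the role of the guidestar.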
According to Cao, the imaging method might be useful for autonomous driving, rescue missions, and other remote-sensing applications. In the case of autonomous driving, Cao says: "We can see all the traffic on the crossroads with this method. This might help the cars to foresee the potential danger that one is not able to see directly."
The use of UNCOVER might not only allow automobiles to see as well as humans but also help humans become better drivers. Whereas a human driver might spot an upcoming jaywalker only a few feet away, an autonomous car outfitted with UNCOVER technology could potentially spot that pedestrian on the next block, provided that the imaging conditions are optimal.