Researchers at the Massachusetts Institute of Technology (MIT) have developed a way to produce images of objects hidden in thick fog and also judge their distance in the process.
Fog has long been a problem for autonomous vehicles, because many driverless cars rely on visible light to guide their navigation systems.
Most autonomous vehicles use a time-of-flight camera, which fires short bursts of light and measures distance by how long the reflections take to return. In clear air these systems gauge distance accurately, but on a foggy day the light is scattered by water droplets suspended in the air, and the measurements become significantly less accurate.
The MIT researchers turned to statistics to compensate for the accuracy lost to fog. They found that the arrival times of light reflected by the water droplets follow a pattern called a gamma distribution, largely independent of the fog's thickness. By estimating that distribution from the incoming measurements, the system can subtract the fog's contribution and recover distances accurately enough for an autonomous vehicle to keep driving.
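The idea can be sketched in a few lines of code. The snippet below is an illustrative toy, not the MIT system: all parameters (the fog's gamma shape and scale, the object's round-trip time, photon counts) are made up for the demonstration. It simulates a time-of-flight histogram containing fog-scattered photons plus a narrow object pulse, fits a gamma distribution to the measured arrival times by the method of moments, subtracts the fitted fog background, and locates the object as the largest residual peak.

```python
import math
import random

random.seed(0)

# --- Simulate a time-of-flight histogram in fog (illustrative values only) ---
FOG_SHAPE, FOG_SCALE = 2.0, 5.0   # assumed gamma parameters of the fog returns
T_OBJECT = 30.0                   # hypothetical round-trip time of the object (ns)
BINS, T_MAX = 60, 60.0            # histogram: 1 ns bins from 0 to 60 ns

fog_photons = [random.gammavariate(FOG_SHAPE, FOG_SCALE) for _ in range(20000)]
obj_photons = [random.gauss(T_OBJECT, 0.3) for _ in range(1500)]
arrivals = fog_photons + obj_photons

counts = [0] * BINS
for t in arrivals:
    i = int(t / T_MAX * BINS)
    if 0 <= i < BINS:
        counts[i] += 1

# --- Fit a gamma distribution to the measured arrival times (method of moments) ---
n = len(arrivals)
mean = sum(arrivals) / n
var = sum((t - mean) ** 2 for t in arrivals) / n
shape_hat = mean ** 2 / var       # gamma shape k = mean^2 / variance
scale_hat = var / mean            # gamma scale theta = variance / mean

def gamma_pdf(x, k, theta):
    """Probability density of the gamma distribution at x."""
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

# --- Subtract the expected fog counts per bin, then find the residual peak ---
bin_width = T_MAX / BINS
residual = [
    counts[i] - n * gamma_pdf((i + 0.5) * bin_width, shape_hat, scale_hat) * bin_width
    for i in range(BINS)
]
peak_bin = max(range(BINS), key=lambda i: residual[i])
estimated_time = (peak_bin + 0.5) * bin_width

print(f"estimated round-trip time: {estimated_time:.1f} ns")  # close to the true 30 ns
```

The object pulse here is roughly twenty times smaller than the fog's peak, yet once the fitted gamma background is subtracted it dominates the residual histogram, which is the essence of the statistical trick.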
The researchers’ system can even use this information to remove the reflections from the water droplets entirely, producing an image of the hidden object.
MIT News gives a deeper description of the technology and how it works.