MIT researchers have developed a chip that leverages sub-terahertz wavelengths for object recognition, which could be combined with light-based image sensors to help steer driverless cars through fog. Image courtesy of the researchers
On-chip system that detects signals at sub-terahertz wavelengths could help steer driverless cars through fog and dust.

Autonomous vehicles relying on light-based image sensors often struggle to see through blinding conditions such as fog. But MIT researchers have developed a sub-terahertz-radiation receiving system that could help steer driverless cars when traditional methods fail.

Sub-terahertz wavelengths, which sit between microwave and infrared radiation on the electromagnetic spectrum, can be detected through fog and dust clouds with ease, whereas the infrared-based LiDAR imaging systems used in autonomous vehicles struggle in those conditions. To detect objects, a sub-terahertz imaging system sends an initial signal through a transmitter; a receiver then measures the absorption and reflection of the rebounding sub-terahertz wavelengths and passes that measurement to a processor, which recreates an image of the object. But implementing sub-terahertz sensors in driverless cars is challenging.
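The transmit-receive-reconstruct pipeline described above can be sketched in miniature. The following is an illustrative toy model only, not the MIT chip's actual design: the function names, the scalar fog-attenuation factor, and the per-pixel forward model are all assumptions made for clarity.

```python
import numpy as np

def scan_scene(reflectivity, tx_power=1.0, fog_transmission=0.95):
    """Forward model: a transmitter emits a signal, and the receiver
    measures reflected power at each scan position. Sub-THz radiation
    passes through fog largely unattenuated, so fog_transmission is
    close to 1.0 (illustrative value, not a measured figure)."""
    return tx_power * fog_transmission * reflectivity

def reconstruct_image(received, tx_power=1.0, fog_transmission=0.95):
    """Processor step: invert the forward model to estimate each
    pixel's reflectivity, recreating an image of the object."""
    return received / (tx_power * fog_transmission)

# Toy 3x3 scene: one strongly reflective object (0.9) on a dark background.
scene = np.array([[0.1, 0.1, 0.1],
                  [0.1, 0.9, 0.1],
                  [0.1, 0.1, 0.1]])

received = scan_scene(scene)          # receiver measurements
image = reconstruct_image(received)   # processor's recreated image
print(np.allclose(image, scene))      # True: the scene is recovered
```

In this idealized model the reconstruction is exact; the article's point is that real systems must recover the image from signals that, away from the sub-terahertz band, would be heavily scattered by fog.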