Human eyes adjust effortlessly to darkness. A few minutes after stepping into a dim room or driving through a tunnel at night, vision sharpens, objects come into focus, and movement is easier to track. This ability comes from a combination of immediate sensitivity and short-term memory built into the retina itself. Rod cells detect weak light. Neurons store patterns. Together, they let the brain build a coherent picture even in near darkness.
Artificial vision systems do none of this. Cameras capture light, but they rely on separate memory and processing units to interpret it. In dim conditions, this pipeline breaks down. Signals get noisy. Processing lags. And without a way to remember what was just seen, tracking motion or recognizing shapes becomes unreliable. The result is a major weakness in technologies that need to see in the dark, from autonomous vehicles to low-power robotics and surveillance.
Solving this problem requires more than just better sensors. It requires hardware that behaves more like a retina, adapting to weak light while storing and processing visual information locally. The field of retinomorphic vision aims to build such systems by mimicking biological principles in electronic devices. But one of the biggest technical barriers has remained unresolved.
Even the most light-sensitive materials, such as quantum dots, struggle to generate usable signals in low-light environments because the electron-hole pairs they produce remain bound together rather than separating into free carriers. Without charge separation, there is no photocurrent to read out, no memory to form, and no adaptation to achieve.
A study published in Advanced Materials ("Ferroelectric Quantum Dots for Retinomorphic In-Sensor Computing") presents a solution to this problem. The researchers developed ferroelectric quantum dots that combine strong light absorption with built-in electric fields. These fields separate photo-generated charges, enabling a device that can detect, adapt to, and remember visual information in real time under low-light conditions.
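The behavior described here, a sensor element that responds to light, adapts under sustained illumination, and retains a fading trace afterward, can be illustrated with a toy model. The sketch below is not the paper's device physics; the pixel state, gain, and decay constant are all hypothetical parameters chosen to show the qualitative idea of in-sensor memory and adaptation.

```python
# Illustrative toy model (NOT the paper's device): a single "retinomorphic
# pixel" whose stored state is potentiated by light and decays over time,
# giving short-term memory and adaptation.

import math

def simulate_pixel(light, gain=0.5, decay_tau=20.0, dt=1.0):
    """Return the pixel's stored state over time.

    light     -- sequence of light intensities (arbitrary units)
    gain      -- how strongly each light sample potentiates the state
    decay_tau -- time constant of the state's decay (memory fading)
    """
    state = 0.0
    trace = []
    for intensity in light:
        # Potentiation saturates: sustained light raises the state less
        # and less, a crude stand-in for adaptation.
        state += gain * intensity * (1.0 - state) * dt
        # Exponential decay: the stored trace fades when the light is gone.
        state *= math.exp(-dt / decay_tau)
        trace.append(state)
    return trace

# A bright pulse followed by darkness: the state rises, then fades
# gradually rather than vanishing at once, mimicking retained information.
trace = simulate_pixel([1.0] * 5 + [0.0] * 15)
print(f"after pulse: {trace[4]:.3f}, 15 steps later: {trace[-1]:.3f}")
```

Running the sketch shows the state climbing during the light pulse and then decaying slowly in darkness, the same qualitative signature (sensitivity, adaptation, short-term memory) that the ferroelectric quantum dots provide in hardware.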