Surface Optics & Neural-Based Processing: MetaOptics

An international team of researchers at Princeton University and the University of Washington has developed a micro-sized camera the size of a grain of salt.

One of the most promising uses of an ultracompact camera is inside the human body, where the technology could let nano-bots sense their surroundings and spot problems. Current micro-scale devices capture fuzzy, distorted images with limited fields of view; the new system was developed to take this problem head-on.

Micro-sized camera

The tiny device can produce sharp, full-color images on par with a conventional compound camera lens roughly 500,000 times larger in volume.

The system could help medical nano-bots diagnose and treat disease through minimally invasive endoscopy, and it could improve imaging for other robots with tight size and weight constraints. Arrays of many such cameras could be used for:

  • full-scene sensing and
  • turning entire surfaces into cameras

Metasurface tech

To focus light, conventional cameras use curved glass or plastic lenses; the new optical system instead relies on metasurface technology.

The metasurface is studded with 1.6 million cylindrical posts, each approximately 100 nm in diameter, roughly the size of the human immunodeficiency virus (HIV). Just a millimeter wide, the arrangement resembles a computer chip.

Each post acts like an optical antenna, and each is given a unique geometry so that together the posts correctly shape the entire optical wavefront. Machine learning-based algorithms tie the posts and their interactions with light to the final output, yielding the highest-quality images and the widest field of view of any full-color metasurface camera developed to date.
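
To illustrate the idea at a conceptual level, the toy sketch below (not the team's design code) treats each nanopost as a local phase shifter and picks a post diameter from a hypothetical phase library so that the surface approximates a standard focusing phase profile. The wavelength, focal length, post pitch, grid size, and the diameter-to-phase curve are all placeholder assumptions.

```python
# Toy sketch: each nanopost is modeled as a local phase shifter, and post
# diameters are chosen to approximate a target lens phase profile.
# All numbers below are illustrative placeholders, not the device's values.
import numpy as np

WAVELENGTH = 550e-9          # assumed design wavelength (green light), meters
FOCAL_LENGTH = 1e-3          # assumed 1 mm focal length
PITCH = 350e-9               # assumed center-to-center post spacing
N = 201                      # posts per side in this small example grid

# Coordinates of each post on the metasurface
coords = (np.arange(N) - N // 2) * PITCH
x, y = np.meshgrid(coords, coords)
r2 = x**2 + y**2

# Target hyperbolic lens phase (a standard metalens focusing profile)
k = 2 * np.pi / WAVELENGTH
target_phase = -k * (np.sqrt(r2 + FOCAL_LENGTH**2) - FOCAL_LENGTH)
target_phase = np.mod(target_phase, 2 * np.pi)

# Hypothetical library: phase delay imparted by a post of a given diameter.
# In practice this curve comes from electromagnetic simulation.
diameters = np.linspace(60e-9, 140e-9, 256)        # candidate post sizes
library_phase = np.linspace(0, 2 * np.pi, 256)     # placeholder response

# For every post, pick the diameter whose phase is closest to the target
idx = np.abs(library_phase[None, None, :] - target_phase[..., None]).argmin(-1)
post_diameters = diameters[idx]

print(post_diameters.shape, post_diameters.min(), post_diameters.max())
```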

Surface optical technology and neural-based processing

The integrated design of the optical surface and the signal-processing algorithms is the key to the camera's performance in natural light. Earlier metasurface cameras needed artificially ideal laboratory conditions and pure laser light to create high-quality images; the new micro-sized camera, by contrast, works under natural illumination.
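
The sketch below shows what such an integrated (end-to-end) design loop can look like in principle; it is a minimal stand-in, not the authors' pipeline. A learnable blur kernel plays the role of the metasurface point spread function, a tiny CNN stands in for the neural reconstruction stage, and both are optimized together so that the final restored image, rather than the raw sensor capture, is sharp. All module names, sizes, and training data here are illustrative assumptions.

```python
# Minimal end-to-end design sketch (assumptions, not the published code):
# optical parameters and a reconstruction network are optimized jointly.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyOptics(nn.Module):
    """Stand-in forward model: a learnable blur kernel plays the role of the
    metasurface point spread function (PSF)."""
    def __init__(self, ksize=15):
        super().__init__()
        self.kernel_logits = nn.Parameter(torch.randn(1, 1, ksize, ksize))

    def forward(self, img):
        # Softmax keeps the PSF non-negative and normalized to sum to 1
        psf = torch.softmax(self.kernel_logits.flatten(), 0).view_as(self.kernel_logits)
        psf = psf.repeat(img.shape[1], 1, 1, 1)          # one PSF per color channel
        pad = self.kernel_logits.shape[-1] // 2
        return F.conv2d(img, psf, padding=pad, groups=img.shape[1])

class ToyDeconv(nn.Module):
    """Tiny CNN standing in for the neural reconstruction stage."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1),
        )

    def forward(self, measurement):
        return self.net(measurement)

optics, deconv = ToyOptics(), ToyDeconv()
opt = torch.optim.Adam(list(optics.parameters()) + list(deconv.parameters()), lr=1e-3)

for step in range(100):                       # training sketch on random images
    target = torch.rand(4, 3, 64, 64)         # placeholder for natural images
    measurement = optics(target)              # simulated sensor capture
    restored = deconv(measurement)            # neural reconstruction
    loss = F.mse_loss(restored, target)       # loss on the final restored image
    opt.zero_grad()
    loss.backward()                           # gradients reach optics and network
    opt.step()
```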

As noted above, the new system matches the image quality of conventional optics roughly 500,000 times larger in volume, something previous metasurface cameras could not approach. Designing and configuring the tiny microstructures is not an easy task, and older metasurface lenses suffered from problems such as:

  • image distortions
  • small fields of view
  • limited ability to capture the full spectrum of visible light

The new optical system overcomes these hurdles, giving the single optical element a clear edge over its predecessors.

Metasurface physical structure

The metasurface is based on silicon nitride, a glassy, amorphous material used as a high-temperature structural component in automotive engines, gas turbines, and combustor parts. It is also compatible with the standard semiconductor manufacturing methods used for computer chips, so it can be mass-produced at lower cost than the lenses in conventional cameras.

Computational simulator to automate testing

Co-lead author Shane Colburn added that the team created a computational simulator to automate the testing of different nano-antenna configurations. Because there are so many antennas and their interactions with light are complex, modeling the system exactly would require enormous amounts of memory and time; a simulator that can still predict the metasurface's image production with adequate precision was therefore the practical option.
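
As a rough illustration of the kind of shortcut such a simulator can take (an assumption-laden stand-in, not the published tool), the sketch below summarizes the metasurface by a single pupil phase map, obtains the point spread function with one FFT, and simulates the sensor measurement as a convolution of the scene with that PSF instead of tracking every nano-antenna individually. The phase map and test scene are random placeholders.

```python
# Simplified stand-in simulator: pupil phase map -> PSF -> convolution.
# This keeps memory and runtime manageable compared with tracking each antenna.
import numpy as np

def psf_from_phase(phase, amplitude=None):
    """Far-field intensity PSF of a pupil with the given phase map
    (Fraunhofer approximation)."""
    if amplitude is None:
        amplitude = np.ones_like(phase)
    pupil = amplitude * np.exp(1j * phase)
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(field) ** 2
    return psf / psf.sum()

def render(scene, psf):
    """Simulate the sensor measurement as a (circular) convolution of the
    scene with the PSF, done in the frequency domain."""
    S = np.fft.fft2(scene, s=scene.shape)
    H = np.fft.fft2(np.fft.ifftshift(psf), s=scene.shape)
    return np.real(np.fft.ifft2(S * H))

phase = np.random.uniform(0, 2 * np.pi, (256, 256))   # placeholder phase map
scene = np.random.rand(256, 256)                       # placeholder test image
measurement = render(scene, psf_from_phase(phase))
print(measurement.shape)
```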

Takeaway

With its object detection and sensing capabilities, the optical system could give an immediate boost to the tiny sensors used in nearly every field, from medical imaging and commodity smartphones to security, robotics, and autonomous driving.

The ultracompact cameras could open up new capabilities in endoscopy and brain imaging. Images produced with the new method are crisp and clear rather than hazy, and the innovative neural nano-optics imager goes a long way toward closing the performance gap with conventional cameras.

Via: Princeton University
