Nature Lends Imaging an “Eye” Up
If necessity is the mother of invention, evolution may be the sibling of innovation
Evolution is the greatest of innovators. Over hundreds of millions of years, it has solved so many problems in so many different ways that increasingly, our technology employs biomimicry—emulating solutions found in nature. Machine vision is one of many technologies benefiting from biomimicry. Here are three recent examples.
A meeting of underwater eyes
Engineers from the University of Wisconsin-Madison recently combined characteristics of the eyes of lobsters and the African elephantnose fish to build an artificial eye that can see in the dark. Most technologies that improve the sensitivity of imaging systems enhance the sensor; this new approach improves the lens, quadrupling the intensity of the incoming light focused on the sensor.
Unlike the smooth-surfaced retinas of most animal eyes, the retina of the African elephantnose fish comprises thousands of tiny crystal cups. These collect and intensify red light, helping the fish evade predators. To emulate them, the researchers engineered thousands of minuscule parabolic mirrors, each as small as a grain of pollen.
Next, they placed arrays of the mirrors across the surface of a uniform hemispherical dome, mimicking the superposition compound eye of the lobster, which concentrates incoming light on individual spots, further increasing intensity.
And where might we “see” these deployed?
Potential applications for this bio-inspired camera include search-and-rescue and bomb-defusing robots, laparoscopic surgery, and high-powered telescopes, all of which must detect fine details in near or complete darkness.
Inside the mind of the mantis
The praying mantis is the only insect known to have stereoscopic (3D) vision. To investigate the mechanisms of mantis vision, researchers at Newcastle University created a mantis movie theatre, complete with old-school 3D glasses affixed to the mantises with beeswax and resin.
Then they showed the insects video of simulated prey within striking distance. With 2D footage the mantises didn't react; with 3D footage they struck, demonstrating that they could perceive the distance to their prey.
Now the researchers hope to learn how the mantis combines the views from its left and right eyes, a task it performs far better than today's robots despite having a brain smaller than the head of a pin. This knowledge could lead to a computer algorithm that substantially improves robot depth perception by modeling mantis vision.
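The underlying geometry the mantis exploits is binocular disparity: the closer an object is, the more its position shifts between the two eyes' views. A minimal sketch of that triangulation, with hypothetical camera parameters (the focal length and eye separation below are illustrative, not from the study):

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.06):
    """Distance to a point seen by two horizontally offset cameras.

    disparity_px: horizontal shift of the point between the left and
    right images, in pixels. Larger disparity means a closer object.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A target shifted 30 px between the two views would sit at:
print(depth_from_disparity(30.0))  # 1.4 (metres)
```

The hard part, and the part the mantis research aims to illuminate, is not this formula but reliably matching the same point across the two views before the disparity can be measured.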
Nature’s drones find new paths
Bees have low-resolution vision and small brains yet they have no difficulty navigating through thick vegetation. How do they find gaps to fly through? Researchers at Sweden’s Lund University have discovered that the green orchid bee from Panama can detect increased light intensity from breaks in foliage. The greater the intensity, the larger the opening.
This approach is fast and computationally simple because the orchid bee looks for patterns rather than details. It could give micro drones, some as small as 15 centimeters, similar navigational capabilities, allowing them to fly without human control. The researchers are now working on mathematical models and digital systems to recreate in three dimensions what the orchid bee sees.
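The bee's strategy can be sketched as a very cheap computation: scan a coarse brightness map of the scene and steer toward the brightest patch, treating higher intensity as a larger gap. The grid values and function name below are illustrative assumptions, not taken from the Lund study:

```python
def brightest_gap(intensity_grid):
    """Return the (row, col) of the brightest cell in a 2D list,
    i.e. the likely largest opening in the foliage."""
    best, where = -1.0, (0, 0)
    for r, row in enumerate(intensity_grid):
        for c, value in enumerate(row):
            if value > best:
                best, where = value, (r, c)
    return where

# Dense foliage (low values) with one bright opening:
scene = [
    [0.1, 0.2, 0.1],
    [0.2, 0.9, 0.3],  # bright gap at row 1, col 1
    [0.1, 0.2, 0.2],
]
print(brightest_gap(scene))  # (1, 1)
```

No edges, textures, or object models are needed, which is why the method suits drones with tiny processors and low-resolution cameras.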