19 March 2009
New clues in understanding face perception
by Kate Melville
Humans excel at recognizing faces, but neuroscience can't yet say how we accomplish it. Now, in an effort to explain our success in this area, researchers are taking a closer look at how and why we fail at face recognition - specifically, at our impaired ability to recognize faces in photographic negatives. The study, appearing in the Proceedings of the National Academy of Sciences, suggests that a large part of the answer might lie in the brain's reliance on a certain kind of image feature.
The work may eventually lead to advanced computer vision systems, and the results and methodologies could help researchers probe face-perception skills in children with autism, who are often reported to experience difficulties analyzing facial information.
Those of us who were around before digital photography will remember that it's much harder to identify people in photographic negatives than in normal photographs. "You have not taken away any information, but somehow these faces are much harder to recognize," says MIT's Pawan Sinha, an associate professor of brain and cognitive sciences and senior author of the study.
Sinha has previously studied light and dark relationships between different parts of the face, and found that in nearly every normal lighting condition, a person's eyes appear darker than the forehead and cheeks. He theorized that photo negatives are hard to recognize because they disrupt these very strong regularities around the eyes.
To test this idea, Sinha asked experimental subjects to identify photographs of famous people in not only positive and negative images, but also in a third type of image in which the celebrities' eyes were restored to their original levels of luminance, while the rest of the photo remained in negative.
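The construction of such a "contrast chimera" stimulus can be sketched in a few lines of code. This is only an illustration using a synthetic grayscale array and an assumed rectangular eye region - the study itself used photographs of celebrities with actual eye locations - but it shows the basic manipulation: invert the whole image, then restore the eye region to its original luminance.

```python
import numpy as np

def make_negative(img):
    """Invert an 8-bit grayscale image: the photographic negative."""
    return 255 - img

def make_contrast_chimera(img, eye_region):
    """Negative everywhere except the eye region, which keeps its
    original (positive) luminance. eye_region is a (row_slice,
    col_slice) pair -- a simplification for this sketch."""
    chimera = make_negative(img)
    chimera[eye_region] = img[eye_region]
    return chimera

# Tiny synthetic "face": bright forehead/cheeks, darker eyes,
# matching the regularity Sinha observed under normal lighting.
face = np.full((8, 8), 200, dtype=np.uint8)
eyes = (slice(2, 4), slice(1, 7))
face[eyes] = 60  # eyes darker than the surrounding skin

negative = make_negative(face)
chimera = make_contrast_chimera(face, eyes)

# In the negative, the dark eyes flip to light (195 vs. 55 forehead);
# in the chimera, the original eye luminance is restored.
print((negative[eyes] == 195).all())        # True
print((chimera[eyes] == face[eyes]).all())  # True
```

In the real stimuli the restored eyes were blended into the surrounding negative image; the point of the manipulation is simply that the eyes regain their normal luminance while the rest of the face remains inverted.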
It turned out that the subjects had a much easier time recognizing these "contrast chimera" images. According to Sinha, that's because the light/dark relationships between the eyes and surrounding areas are the same as they would be in a normal image. Similar contrast relationships can be found in other parts of the face, primarily the mouth, but those relationships are not as consistent. "The relationships around the eyes seem to be particularly significant," says Sinha.
Other studies have shown that people with autism tend to focus on the mouths of people they are looking at, rather than the eyes, so the new findings could help explain why autistic people have such difficulty recognizing faces, Sinha believes.
The findings also suggest that neuronal responses in the brain may be based on these relationships between different parts of the face. The team found that when they scanned the brains of people performing the recognition task, regions associated with facial processing (the fusiform face areas) were far more active when looking at the contrast chimeras than when looking at pure negatives.
Source: Massachusetts Institute of Technology