Individuals with autism spectrum disorders demonstrate a pattern of brain activity during face discrimination that is consistent with feature-based strategies that are more typical of nonface object perception
The crucial factor that impairs the face-processing strategy in autism, while leaving facial recognition abilities intact, is the presence of emotional and social information. In recent years, functional neuroimaging and biochemical studies have elucidated the neuroanatomical structures active during emotional and social perception. Moreover, differences have been localized and quantified that distinguish healthy from autistic participants. The neural substrates of emotion have been identified by presenting affective stimuli while neuroimaging techniques record regional activity. Facial expressions are the most widely utilized form of affective stimuli, for they arguably carry the greatest social and emotional import.
People with autism have a different pattern of brain activity when looking at faces compared with people without the disorder.
Friday, May 05, 2000, All Things Considered
CVI affects how an individual understands the visual information received by the eyes. The location and extent of the brain insult determines which functional behaviors will be affected.
People suffering from prosopagnosia, however, do not exhibit deficits in distinguishing between faces and other stimuli, but in attributing meaning to faces.
Prosopagnosia is a deficit in face recognition that can be limited to recognition of either previously familiar faces or unfamiliar faces.
Results support the usefulness of the program to teach the detection of facial affect. However, the improvement found is limited to a circumscribed area of social-communicative function and generalization is not ensured.
Recent neuroimaging studies in adults indicate that visual areas selective for recognition of faces can be recruited through expertise for nonface objects. This reflects a new emphasis on experience in theories of visual specialization. In addition, novel work infers differences between categories of nonface objects, allowing a re-interpretation of differences seen between recognition of faces and objects. Whether there are experience-independent precursors of face expertise remains unclear; indeed, parallels between literature for infants and adults suggest that methodological issues need to be addressed before strong conclusions can be drawn regarding the origins of face recognition.
We propose a model for the organization of this system that emphasizes a distinction between the representation of invariant and changeable aspects of faces.
Functional magnetic resonance imaging (fMRI) was used to compare brain activation to static facial displays versus dynamic changes in facial identity or emotional expression. Static images depicted prototypical fearful, angry and neutral expressions. Identity morphs depicted identity changes from one person to another, always with neutral expressions. Emotion morphs depicted expression changes from neutral to fear or anger, creating the illusion that the actor was 'getting scared' or 'getting angry' in real time. Brain regions implicated in processing facial affect, including the amygdala and fusiform gyrus, showed greater responses to dynamic versus static emotional expressions, especially for fear. Identity morphs activated a dorsal fronto-cingulo-parietal circuit and additional ventral areas, including the amygdala, that also responded to the emotion morphs. Activity in the superior temporal sulcus discriminated emotion morphs from identity morphs, extending its known role in processing biologically relevant motion. The results highlight the importance of temporal cues in the neural coding of facial displays.
Responsiveness of N200 and related ERPs to the perceptual features of faces and other images was assessed. N200 amplitude did not vary substantially, whether evoked by colored or grayscale faces; normal, blurred or line-drawing faces; or by faces of different sizes. Human hands evoked small N200s at face-specific sites, but evoked hand-specific ERPs at other sites. Cat and dog faces evoked N200s that were 73% as large as to human faces. Hemifield stimulation demonstrated that the right hemisphere is better at processing information about upright faces and transferring it to the left hemisphere, whereas the left hemisphere is better at processing information about inverted faces and transferring it to the right hemisphere. N200 amplitude was largest to full faces and decreased progressively to eyes, face contours, lips and noses viewed in isolation. A region just lateral to face-specific N200 sites was more responsive to internal face parts than to faces, and some sites in ventral occipitotemporal cortex were face-part-specific. Faces with eyes averted or closed evoked larger N200s than those evoked by faces with eyes forward. N200 amplitude and latency were affected by the joint effects of eye and head position in the right but not in the left hemisphere. Full and three-quarter views of faces evoked larger N200s than did profile views. The results are discussed in relation to behavioral studies in humans and single-cell recordings in monkeys.
In this context of high certainty about normal functional patterns, Schultz et al.7 conducted the first neuroimaging study to test whether autism involves abnormal neurofunctional activation during face processing.
The present results support the prediction that face recognition is enhanced when targets display direct rather than deviated gaze.
People who are tone deaf are not deaf to tones. They can hear tones, they just can't tell them apart. People who are color blind can see things that are in color. They just can't tell colors apart. Similarly, I can see faces. I just can't tell them apart.
Now a team led by Thomas Gruter at the Institute for Human Genetics in Münster, Germany, who is a prosopagnosic himself, has found concrete evidence of its genetic basis. "I realised I had prosopagnosia quite early on in school," Gruter says. He has trouble recognising faces of people he knows and sometimes thinks he recognises strangers.
Most people identify people by remembering other people's faces. People with face blindness don't seem to be able to do this, and thus must rely on other physical traits.
Most adolescents with autism form a normal configuration-based face representation, but the absence of the composite effect indicates that they are less prone to use the contextual information of the face in visual search.
Impairments in face processing are well documented in autism. Such impairments are part of a more general dysfunction in complex social brain circuitry in persons with autism. Because neural systems underlying face processing are present early in life, impairments in face processing likely represent a dysfunction of these very early brain systems. Typical neonates exhibit a visual preference for faces and rapid face recognition. At the age of 6 months, typically developing infants demonstrate specific brain responses, as measured by event-related potentials (ERPs), to familiar vs unfamiliar faces and to fearful vs neutral facial expressions. These early face-processing skills are important for learning to interpret emotional expression and share attention with others.
It seems that only under very limited time resources are local identification processes particularly beneficial for the recognition of faces. If there are no such time constraints, then holistic face-processing strategies seem to be more advantageous.
Autistic 3- and 4-year-olds respond to pictures of familiar toys, but not to photographs of their mothers. The findings indicate that facial recognition tests could identify autism at younger ages.
There are lots of implications other than embarrassment. I can't picture the faces of people who have passed on. I see pictures and I remember, but I can't conjure up the image in my mind without that visual prompt. That worries me.
If you ever wonder whether I (being face-blind) would be able to recognize you or some other person, consider whether you could learn to recognize a stone in that situation. If you could, there is a fair chance that I would be able to recognize a human.
Face blindness is a condition in which a person has great difficulty remembering faces. Anyone interested in this can post here: people with face blindness, relatives of them, people working in the field, or people that are just interested.
Neuroimaging findings of individuals with autism and Asperger's syndrome indicate dysfunction not only within neural systems important for face processing but also within regions underlying higher-order cognitive functions, including theory of mind, the ability to understand others' mental states.
Prosopagnosia (PA), or face blindness, is a common but largely unrecognized dysfunction, according to the results of a study reported in the June 30 Early View issue of the American Journal of Medical Genetics Part A. "Acquired prosopagnosia (PA) is a rare condition after, for example, a stroke or brain injury," write Ingo Kennerknecht, MD, from Westfälische Wilhelms-Universität in Münster, Germany, and colleagues. "The congenital form of PA is generally considered to be even less common. Beside a few single case reports and anecdotal mentioning of familial cases no data on the epidemiology exists." The investigators administered a questionnaire for screening in local secondary schools and at their medical facility to identify possible candidates with PA. These candidates underwent a semistructured interview followed by examinations of first-degree relatives.
The cortical activation pattern during face processing indicates deficits in the face-specific regions, with higher activations in regions involved in visual search. These findings reflect different strategies for visual processing, supporting models that propose a predisposition to local rather than global modes of information processing in autism.
Subjects with autistic disorder differed significantly from controls in the activity of cerebellar, mesolimbic and temporal lobe cortical regions of the brain when processing facial expressions. Notably, they did not activate a cortical 'face area' when explicitly appraising expressions, or the left amygdala region and left cerebellum when implicitly processing emotional facial expressions. High-functioning people with autistic disorder have biological differences from controls when consciously and unconsciously processing facial emotions, and these differences are most likely to be neurodevelopmental in origin.
Diminished gaze fixation is one of the core features of autism and has been proposed to be associated with abnormalities in the neural circuitry of affect. We tested this hypothesis in two separate studies using eye tracking while measuring functional brain activity during facial discrimination tasks in individuals with autism and in typically developing individuals. Activation in the fusiform gyrus and amygdala was strongly and positively correlated with the time spent fixating the eyes in the autistic group in both studies, suggesting that diminished gaze fixation may account for the fusiform hypoactivation to faces commonly reported in autism. In addition, variation in eye fixation within autistic individuals was strongly and positively associated with amygdala activation across both studies, suggesting a heightened emotional response associated with gaze fixation in autism.
These results suggest that face-processing deficits can be found across different input modalities. Our findings also extend the notion of configural processing to haptic face and object recognition.
Face recognition abnormalities in autism are not fully explained by an impairment of holistic face processing; rather, an unusual significance is accorded to the mouth region when children with autism process information from people's faces.
These data suggest that regions of temporal cortex actively integrate form and motion information, a process largely independent of low-level visual processes such as changes in local luminance and contrast.
We give a somewhat idiosyncratic history of the development of neural network models of face processing, concentrating on work at UCSD, and show how these models have led to a novel hypothesis concerning processing of facial expression.
People show a left visual field (LVF) bias for faces, i.e., involving the right hemisphere of the brain. Lesion and neuroimaging studies confirm the importance of the right hemisphere and suggest separable neural pathways for processing facial identity vs. emotions. We investigated the hemispheric processing of faces in adults with and without Asperger syndrome (AS) using facial emotion and identity chimeric tasks. Controls showed an LVF bias in both tasks, but no perceptual bias in a non-social control task. The AS group showed an LVF bias during both tasks; however, the bias was reduced in the identity condition. Further, the AS group showed an LVF bias in the non-social condition. These results show a differential pattern of hemispheric processing of faces in AS.
Face blindness causes difficulty recognizing people, severely impacting social interaction... There are many ways of coping with face blindness, some of which I'll describe below. Some of these methods may be appropriate for parents of face-blind children, while others will be useful to face-blind people themselves, in work or adult social situations.
Lincoln Holmes can see perfectly well - but he cannot recognise his own face.
Brain imaging research has identified at least two regions in human extrastriate cortex responding selectively to faces. One of these is located in the mid-fusiform gyrus (FFA), the other in the inferior occipital gyrus (IOG). We studied activation of these areas using fMRI in three individuals with severely impaired face recognition (one pure developmental and two childhood prosopagnosics). None of the subjects showed the normal pattern of higher fMRI activity to faces than to objects in the FFA and IOG or elsewhere. Moreover, in two of the patients, faces and objects produced similar activations in the regions corresponding to where the FFA and IOG are found in normal subjects. Our study casts light on the important role of FFA and IOG in the network of areas involved in face recognition, and indicates limits of brain plasticity.
Neural activity was measured in 10 healthy volunteers by functional MRI while they viewed familiar and unfamiliar faces and listened to familiar and unfamiliar voices. The familiar faces and voices were those of people personally known to the subjects; they were not people who are more widely famous in the media. Changes in neural activity associated with stimulus modality irrespective of familiarity were observed in modules previously demonstrated to be activated by faces (fusiform gyrus bilaterally) and voices (superior temporal gyrus bilaterally). Irrespective of stimulus modality, familiarity of faces and voices (relative to unfamiliar faces and voices) was associated with increased neural activity in the posterior cingulate cortex, including the retrosplenial cortex. Our results suggest that recognizing a person involves information flow from modality-specific modules in the temporal cortex to the retrosplenial cortex. The latter area has recently been implicated in episodic memory and emotional salience, and now seems to be a key area involved in assessing the familiarity of a person. We propose that disturbances in the information flow described may underlie neurological and psychiatric disorders of the recognition of familiar faces, voices and persons (prosopagnosia, phonagnosia and Capgras delusion, respectively).
Racial equality is poorly served by insisting that differences between groups don't exist. Even a seemingly elementary matter like identifying faces involves a subtle interplay between what we see and what we perceive.
The three-year study resulted in discovery of reduced activation in the fusiform gyrus - the classic face area of the cerebral cortex. Researchers also observed increased activation in an adjacent region of the brain that processes non-face objects.
It appears that densely prosopagnosic participants may have some covert ability to detect familiarity in a direct test. Affective attitude rather than familiarity may be the stronger influence on responses.
Patterns in response to facial expressions support the notion of distinct neural substrates for processing different facial expressions.
Because prosopagnosia is not a unitary disorder (different people may show different levels of impairment), it has been argued that face perception involves a number of stages, each of which can be separately damaged.
Prosopagnosia, or face-blindness, is a neurological condition that renders a person incapable of recognizing faces. It is unrelated to the person's ability to see faces. This page is a face-blind layman's attempt at explaining prosopagnosia.
Prosopagnosia is a disorder of face perception where the ability to perceive and understand faces is impaired, although other basic perceptual skills (such as recognising and discriminating objects) may be relatively intact.
"Though I clearly view persons in my visual field, my brain fails to 'store' most faces in memory; thus I usually don't remember a face after I see it. My world, it seems, consists mostly of strangers."
Research in our lab has focused on studying face perception using behavioral methods (i.e., face recognition tests), as well as noninvasive brain techniques such as EEG recording and magnetic resonance imaging (MRI).
Face-specific N170 components were consistently elicited in 24 control subjects, but were absent in prosopagnosic patient PHD.
Prosopagnosia is a rare symptom which can be observed in some patients in addition to their other typical features of AS. However, if investigators are aware of the possibility of prosopagnosia, it is easy to detect.
If you believe that you are prosopagnosic or have other types of recognition impairments and are interested in becoming involved with research, please contact us using our form.
There may be a face pathway in the occipitotemporal region. Prosopagnosia, depending on the nature of the damage, can involve facial recognition impairments in general, not just inability to recognize familiar faces.
Some children have trouble remembering a new face, and researchers in Canada may have an uncommon explanation for that. They studied 14 children with a rare condition called nonverbal learning disability. The condition affects less than 1% of the public, and most affected kids generally do well on tests, except for those that involve recognizing and working with shapes. The condition appears to last a lifetime, but there are ways of teaching people to help them improve their visual spatial processing, states the news release. Problems with these processes include difficulties such as drawing (copying as well as drawing to verbal command) and assembling two- and three-dimensional objects.
When people with autism look at a face, the brain area that responds to that information is activated in a way that's very similar to the brain activity of people without autism, new research shows. This finding comes as a surprise, since it's widely recognized that people with autism tend to avoid looking directly at other people's faces. The result also contradicts previous research that found that the face-processing area in the back of the brain is under-responsive in people with autism.
The present study was undertaken in order to determine whether a set of clinical features, which are not included in the DSM-IV or ICD-10 for Asperger Syndrome (AS), are associated with AS in particular or whether they are merely a familial trait that is not related to the diagnosis... An aberrant processing of sensory information appears to be a common feature in AS. The impact of these and other clinical features that are not incorporated in the ICD-10 and DSM-IV on our understanding of AS may hitherto have been underestimated. These associated clinical traits may well be reflected by the behavioural characteristics of these individuals.
The finding is surprising, as it is widely known that autistic individuals tend to avoid looking directly at faces. The research also counters previous published reports that the face-processing area at the back of the brain is under-responsive in people with autism, and it suggests that specific behavioral interventions may help people with autism improve their ability to interact socially. The study involved functional magnetic resonance imaging, or fMRI. Unlike standard MRI scans that show anatomical structures in black and white, fMRI offers digitally enhanced color images of brain function, depicting localized changes in blood flow and oxygenation.
The study investigated the recognition of standardized facial expressions of emotion (anger, fear, disgust, happiness, sadness, surprise) at a perceptual level (experiment 1) and at a semantic level (experiments 2 and 3) in children with autism (N= 20) and normally developing children (N= 20). Results revealed that children with autism were as able as controls to recognize all six emotions with different intensity levels, and that they made the same type of errors. These negative findings are discussed in relation to (1) previous data showing specific impairment in autism in recognizing the belief-based expression of surprise, (2) previous data showing specific impairment in autism in recognizing fear, and (3) the convergence of findings that individuals with autism, like patients with amygdala damage, pass a basic emotions recognition test but fail to recognize more complex stimuli involving the perception of faces or part of faces.
Our modified account shows more clearly that genuine face recognition is consistent with existing concepts of the IAC model, addresses the relevant clinical phenomena, and requires fewer additional assumptions.
Alicia's work aims to identify whether children with autism have specific difficulty in differentiating between facial features or facial expressions, which may affect the quality of their social interactions.
This article reviews behavioral and electrophysiological studies of face processing and discusses hypotheses for understanding the nature of face processing impairments in autism. Based on results of behavioral studies, this study demonstrates that individuals with autism have impaired face discrimination and recognition and use atypical strategies for processing faces characterized by reduced attention to the eyes and piecemeal rather than configural strategies. Based on results of electrophysiological studies, this article concludes that face processing impairments are present early in autism, by 3 years of age. Such studies have detected abnormalities in both early (N170 reflecting structural encoding) and late (NC reflecting recognition memory) stages of face processing. Event-related potential studies of young children and adults with autism have found slower speed of processing of faces, a failure to show the expected speed advantage of processing faces versus nonface stimuli, and atypical scalp topography suggesting abnormal cortical specialization for face processing. Other electrophysiological studies have suggested that autism is associated with early and late stage processing impairments of facial expressions of emotion (fear) and decreased perceptual binding as reflected in reduced gamma during face processing. This article describes two types of hypotheses, cognitive/perceptual and motivational/affective, that offer frameworks for understanding the nature of face processing impairments in autism. This article discusses implications for intervention.
Collectively, these results indicate disorganized processing of face stimuli in autistic individuals and suggest a mechanism that may subserve the social information processing deficits that characterize autism spectrum disorders.