Abstract
Background: Williams syndrome (WS) and Autism Spectrum Disorders (ASD) are neurodevelopmental conditions associated with atypical but opposite patterns of face-to-face interaction: individuals with WS stare excessively at others, whereas individuals with ASD avoid eye contact. Whether these behaviors arise from dissociable visual processes within the occipito-temporal pathways is unknown. Using high-density electroencephalography, multivariate signal-processing algorithms, and a protocol designed to identify and extract evoked activities sensitive to facial cues, we investigated how WS (N=14), ASD (N=14) and neurotypical subjects (N=14) decode the information content of a face stimulus.

Results: We found two neural components in neurotypical participants, both strongest when the eye region was projected onto the subject's fovea, simulating direct eye contact, and weaker for more distant regions, reaching a minimum when the fixated region fell outside the stimulus face. The first component peaks at 170 ms, an early signal known to be implicated in processing low-level facial features. The second emerges later, at 260 ms post-stimulus onset, and is implicated in decoding salient social cues from the face. Remarkably, the two components showed opposite patterns of impairment and preservation in WS and ASD. In WS, the 170 ms signal was only weakly decoded by our facial-feature regressor, probably reflecting these participants' relatively poor processing of facial morphology, whereas the late 260 ms component was highly significant. The reverse pattern was observed in ASD participants, who showed neurotypical-like early 170 ms evoked activity but an impaired late 260 ms evoked signal.

Conclusions: Our study reveals a dissociation between WS and ASD patients and points to different neural origins for their social impairments.
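The abstract refers to time-resolved multivariate decoding of EEG against a facial-cue regressor. As a rough illustration only, the Python sketch below shows one generic way such an analysis can be set up with scikit-learn; the array shapes, the placeholder eye_distance regressor, the injected effect latencies, and all parameters are assumptions for illustration and do not reproduce the authors' pipeline.

```python
# Minimal sketch (not the authors' method): time-resolved regression relating
# single-trial EEG amplitudes to a hypothetical facial-cue regressor, e.g. the
# distance between the fixated location and the stimulus eye region.
# All data are simulated placeholders; shapes and names are assumptions.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)

n_trials, n_channels, n_times = 200, 64, 350        # e.g. -100..600 ms at 500 Hz
eeg = rng.standard_normal((n_trials, n_channels, n_times))  # placeholder EEG epochs
eye_distance = rng.uniform(0, 1, n_trials)                  # placeholder regressor

# Inject a weak dependence at two latencies (samples ~135 and ~180, standing in
# for ~170 ms and ~260 ms) so the sketch has something to recover.
for t in (135, 180):
    eeg[:, :10, t] += 2.0 * eye_distance[:, None]

scores = np.zeros(n_times)
cv = KFold(n_splits=5, shuffle=True, random_state=0)
for t in range(n_times):
    X = eeg[:, :, t]                                 # trials x channels pattern at time t
    model = RidgeCV(alphas=np.logspace(-3, 3, 7))
    # Cross-validated R^2: how well the multichannel pattern predicts the cue value
    scores[t] = cross_val_score(model, X, eye_distance, cv=cv, scoring="r2").mean()

peak = scores.argmax()
print(f"peak decoding at sample {peak} (R^2 = {scores[peak]:.3f})")
```

In a group comparison of the kind summarized above, the same per-timepoint decoding curve would be computed separately for each participant group and contrasted statistically; that step is omitted here.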