Facial expressions are one of the most important sources of information about others' emotional states. Recently, other cues such as body posture have been shown to influence how facial expressions are perceived. It has been argued that this biasing effect is underpinned by an early integration of visual information from facial expression and body posture. Here, we replicate this biasing effect but, using a psychophysical procedure, show that adaptation to facial expressions is unaffected by body context. The integration of face and body information therefore occurs downstream of the sites of adaptation, known to be localised in high-level visual areas of the temporal lobe. Contrary to previous research, our findings thus provide direct evidence for late integration of information from facial expression and body posture. They are consistent with a hierarchical model of social perception, in which social signals from different sources are initially processed independently and in parallel by specialised visual mechanisms. Integration of these different inputs at later stages of the visual system then supports the emergence of the integrated whole-person percept that is consciously experienced.