In primate societies, survival and success depend on the appropriate decoding and production of social signals. Primates can encode and decode facial signals that convey their own and others' emotional states. Yet very little is known about the neural circuits that mediate emotions and control facial movements. In contrast, much is known about how neural circuits process incoming visual information about faces: in macaques, the face-processing network consists of six interconnected face-selective regions within the temporal lobe. To achieve a motor outcome, such as a smiling face, the brain must link sensory activity to the emotional and motor systems. For example, when a smiling face is perceived, the stimulus might, depending on the observer's current emotional state, elicit a state of happiness in that observer and thus a smile. For this, the flow and gating of information across distinct brain areas are crucial. We therefore study how facial sensory information flows across cortical and subcortical areas, how it is gated in a context-dependent manner, and how this flow and gating elicit specific facial motor expressions. Specifically, we aim to assess how sensory information from the face-processing network is gated to elicit distinct motor facial expressions.