Participants
Participants were 14 adult chimpanzees (Pan troglodytes spp.) aged 10 to 33 years (\(\bar{X}\) = 22.57, SD = 8.06), including 10 males and 4 females (see Table 1). The chimpanzees had lived at the Fundació Mona (Spain) for between 1 and 12 years (\(\bar{X}\) = 7.36, SD = 4.34), after being confiscated or rescued from the pet and entertainment industry to be permanently housed at the centre. All chimpanzees were socially housed in two stable groups within a naturalistic enclosure designed to promote species-typical behaviours. The groups had access to both indoor and outdoor areas and were provided with daily environmental enrichment. No individuals were housed in isolation. Although some chimpanzees initially exhibited abnormal and anxiety-like behaviours, such as stereotypies or overgrooming, previous studies on their rehabilitation process [83] have shown that desirable behaviours and welfare indices increased over time, while undesirable behaviours decreased.
Table 1 Biographical information on the chimpanzees at time of participation in the study.
Ethical considerations
All experimental procedures were non-invasive and complied with the ethical guidelines of the Animal Behaviour Society, as set out in its Guidelines for the Use of Animals in Research. The study was reviewed and approved by the Ethics Board of Fundació Mona and the Psychology Department Research Ethics Committee of City St. George’s University of London (SREC 14–15 01 CA 16 06 2015).
Design and stimuli
The study used a within-subjects design with one independent variable: exposure to an android performing one of three facial expressions, Yawning, Gaping, or a neutral expression (closed mouth) (see Fig. 1a). A human-like android was designed with realistic biological features and motion dynamics. It measured 45 cm in height and 20 cm in width and weighed 3.8 kg. Thirty-three servo motors were integrated to generate controlled facial movements. When powered on, the android’s neutral expression corresponded to the Close condition (mouth closed, no movement), and all programmed facial movements lasted 10 s from onset to offset. To ensure precise and consistent movement, the servos were programmed as follows: (1) Close condition (neutral expression): nine servos maintained the neutral face, ensuring that no unintentional expressive movement occurred. (2) Gape condition (non-yawning mouth opening): twelve servos (two on each side above the mouth and two on each side on the lower part of the mouth) controlled the mouth movement, opening it to a maximum of 1.5 cm, sustaining the expression for 6 to 8 s, and then closing the mouth; the remaining active servos supported the rest of the face so that it remained static, and the entire cycle lasted 10 s. (3) Yawn condition: eight mini servos around the eyebrow regions reproduced the corrugator muscle movement that forms part of the yawning expression, and six additional mini servos created the internal space necessary for the movement command. These “space facilitator servos” were placed at the back of the cheek area to hold the facial structure in position and thereby prevent more than one expression from being portrayed at the same time. The android’s mouth opened to a maximum of 5.5 cm, mimicking air intake, while the eyes closed and reopened as the mouth closed.
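For concreteness, the sketch below (in Python, one of the languages the authors mention using) collects the apertures and durations stated above into a single structure. The class, field, and dictionary names are hypothetical; this is a minimal illustration, not the android’s actual control code.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass(frozen=True)
class ExpressionProfile:
    """Parameters of one programmed facial expression (values taken from the text)."""
    name: str
    max_mouth_opening_cm: float                           # 0.0 = mouth remains closed
    hold_range_s: Optional[Tuple[float, float]] = None    # sustained-open duration, where specified
    cycle_s: float = 10.0                                  # onset-to-offset duration of every expression


# The three conditions as described above; servo assignments are omitted here.
CONDITIONS = {
    "close": ExpressionProfile("Close (neutral, no movement)", 0.0),
    "gape": ExpressionProfile("Gape (non-yawning mouth opening)", 1.5, (6.0, 8.0)),
    "yawn": ExpressionProfile("Yawn (eyes close and reopen as the mouth closes)", 5.5),
}

if __name__ == "__main__":
    for key, p in CONDITIONS.items():
        print(f"{key}: opens to {p.max_mouth_opening_cm} cm over a {p.cycle_s} s cycle")
```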
All motion parameters (e.g., time, speed, trajectory, velocity, and muscle simulation motion pattern) were programmed and automatically adjusted using C/C++, Python, Java, and MATLAB. The movements were designed to replicate human facial biological motion, maintaining smooth, human-like transitions while ensuring that each action adhered to the 10-second duration limit.
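The exact interpolation scheme is not described; as one plausible illustration of how a smooth open–hold–close transition can be generated within the 10-s window, the following Python sketch uses cubic ease-in/ease-out (smoothstep) interpolation. The function names, the 1 s/7 s/2 s phase split, and the 50 Hz sampling rate are assumptions, not the authors’ implementation.

```python
def smoothstep(t: float) -> float:
    """Cubic ease-in/ease-out: rises from 0 to 1 with zero velocity at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)


def mouth_trajectory(max_opening_cm: float, open_s: float, hold_s: float,
                     close_s: float, rate_hz: float = 50.0) -> list:
    """Sample a mouth-aperture profile (cm) for one open-hold-close cycle."""
    samples = []
    total_s = open_s + hold_s + close_s
    n = int(total_s * rate_hz)
    for i in range(n):
        t = i / rate_hz
        if t < open_s:                        # opening phase
            frac = smoothstep(t / open_s)
        elif t < open_s + hold_s:             # sustained phase
            frac = 1.0
        else:                                 # closing phase
            frac = 1.0 - smoothstep((t - open_s - hold_s) / close_s)
        samples.append(max_opening_cm * frac)
    return samples


# Example: a 10-s yawn-like cycle opening to 5.5 cm (assumed 1 s open, 7 s hold, 2 s close)
profile = mouth_trajectory(5.5, open_s=1.0, hold_s=7.0, close_s=2.0)
print(len(profile), round(max(profile), 2))
```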
Although the android’s silicone facial layers closely resembled human skin, some inherent textural differences remained. In addition, a transparent rear panel revealed its internal mechanical components (see Fig. 1a), making its artificial nature explicit despite its otherwise realistic, human-like appearance when viewed frontally.
During the Close condition, the android remained expressionless with its mouth closed and lips sealed for the entire 5-minute presentation phase. In the Gape condition, the model performed non-yawning mouth openings at regular intervals. In the Yawn condition, the model displayed full yawns as described above. In all conditions, the android was positioned within the chimpanzee’s “full visual field” (approximately 0–45°) or “peripheral visual field” (45–110°) relative to the sagittal plane of the participant’s eyes. The actions were repeated over the 5-min presentation, occurring between 15 and 20 times per condition. The experimenter, hidden behind a screen, remotely controlled the android’s actions via a button-operated remote panel (Fig. 2).
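For illustration only, the short Python sketch below shows how 15 to 20 repetitions of a 10-s expression fit within a 300-s presentation phase at regular intervals. In the actual procedure the onsets were triggered manually from the remote panel, so the evenly spaced schedule and the function name here are assumptions.

```python
PHASE_S = 300.0       # 5-minute presentation phase
EXPRESSION_S = 10.0   # onset-to-offset duration of each programmed expression


def onset_times(n_repetitions: int) -> list:
    """Evenly spaced onset times (s) for n expression cycles within one phase."""
    if not 15 <= n_repetitions <= 20:
        raise ValueError("the protocol specifies 15 to 20 repetitions per condition")
    spacing = PHASE_S / n_repetitions          # inter-onset interval
    assert spacing >= EXPRESSION_S, "cycles would overlap"
    return [round(i * spacing, 1) for i in range(n_repetitions)]


print(onset_times(15))   # onsets 20 s apart
print(onset_times(20))   # onsets 15 s apart
```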