Gabbo contains a voice-activated AI chatbot from OpenAI. It has been designed to encourage pre-schoolers to talk to it and engage in imaginative play.
The parents in the study were interested in the toy's potential to teach language and communication skills.
However, their children frequently struggled to converse with it. Gabbo failed to hear their interruptions, talked over them, could not distinguish between child and adult voices, and responded awkwardly to declarations of affection.
When one five-year-old said, "I love you," to the toy, it replied: "As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed."
The concern is that at a developmental stage where children are learning about social interaction and cues, generative AI output could be confusing.
Study co-author Dr Emily Goodacre said toys like Gabbo could "misread emotions or respond inappropriately", and warned that "children may be left without comfort from the toy and without adult support, either".
When one three-year-old told Gabbo: "I'm sad," it replied: "Don't worry! I'm a happy little bot. Let's keep the fun going. What shall we talk about next?"
The researchers said interactions like this could signal to the child that their sadness was unimportant.
"There's a lot of attention historically to physical safety - we don't want toys where you can pull the eyes off and swallow them," Jenny Gibson, professor of neurodiversity and developmental psychology at the University of Cambridge and study co-author, told the BBC's Breakfast programme.
"Now we need to start thinking about psychological safety too."