Trash talk has been part of sport and human competition for as long as people have been competitive, but now robots are getting in on the game.
Researchers programmed a robot called Pepper to use mild insults such as 'you are a terrible player' and 'your playing has become confused'.
It would then use these insults while challenging a human to a game called 'Guards and Treasures' that is designed to test rationality.
Even though the robot used very mild language, the human player's performance got worse while they were being insulted, according to lead author Aaron M. Roth.
The team say tests like this could help work out how humans will respond in future if a robot assistant disagrees with a command, such as over whether to buy healthy or unhealthy food.
Some of the participants had enough technical knowledge to understand the insults had been pre-programmed into the robot.
'One participant said, "I don't like what the robot is saying, but that's the way it was programmed so I can't blame it,"' said Mr Roth.
But the researchers found that, overall, human performance changed in response to the robot's comments regardless of technical sophistication.
'This is one of the first studies of human-robot interaction in an environment where they are not cooperating,' said Fei Fang, one of the report's authors.
She said it could have 'enormous implications' for a world where the number of robots and Internet of Things devices with artificial intelligence is expected to grow exponentially.
'We can expect home assistants to be cooperative, but in situations such as online shopping, they may not have the same goals as we do.'
This could include taking our health needs into account over our desire for an unhealthy treat.
The team used a commercially available humanoid robot called Pepper that is produced by SoftBank Robotics.
Pepper is designed with the ability to read emotions and can be programmed by users to perform certain tasks, which the team say made it perfect for this study.
There were 40 participants, and each played the game 35 times with Pepper while either being encouraged or subjected to insults by the device.
Although the human players' score improved with the number of games they played, those who were criticised didn't score as well as those who were praised.
'It has already been shown that humans are affected by what other people say, but this study shows that humans also respond to what a machine says about them', according to Afsaneh Doryab, a researcher on the study.
'This machine's ability to prompt responses could have implications for automated learning, mental health treatment and even the use of robots as companions.'
In future studies, the team hope to look at nonverbal expression between robots and humans, as well as whether people would react differently to a humanoid robot than to a computer box.