- The French Open will use AI technology to filter out abusive and hateful comments on social media platforms during the tournament.
- The software, created by Bodyguard.ai, can identify and filter out racist, homophobic, and other forms of hate speech, and will be available to all athletes participating in the tournament.
- Using AI to combat cyberbullying is a positive step toward protecting athletes' mental health and well-being, and a reminder of the impact online communication can have on others.
PARIS, France – The French Open has announced that it is adopting AI-powered technology to prevent cyberbullying against its tournament players.
In a statement, the organizers said they are partnering with a leading technology company to provide a new tool that can detect and eliminate harmful messages targeting the tournament’s players across social media platforms.
The technology identifies abusive language and other harmful content and flags the posts or messages for review and removal.
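The article does not describe how Bodyguard.ai's system is built, but the detect-then-flag workflow it outlines can be illustrated with a minimal sketch. Everything here is hypothetical: the `ABUSIVE_TERMS` denylist stands in for a trained classification model, and `flag_for_review` is an invented helper name.

```python
# Illustrative sketch only: the article does not describe Bodyguard.ai's
# actual implementation. The denylist and function names are hypothetical.

# A trivial denylist standing in for a real hate-speech classifier.
ABUSIVE_TERMS = {"loser", "disgrace"}

def is_abusive(message: str) -> bool:
    """Return True if the message contains a term from the denylist."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return not ABUSIVE_TERMS.isdisjoint(words)

def flag_for_review(messages: list[str]) -> list[str]:
    """Collect messages to queue for human review and possible removal."""
    return [m for m in messages if is_abusive(m)]

flagged = flag_for_review([
    "Great match today!",
    "You are a disgrace to the sport",
])
print(flagged)  # → ['You are a disgrace to the sport']
```

In a production system the keyword check would be replaced by a language model or classifier, but the overall shape matches the article's description: detect harmful content automatically, then route it to humans for review and removal rather than deleting it outright.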
Gabriela Veira, the tournament’s head of communications, said that implementing this technology is vital, given the increasing prevalence of cyberbullying in the sports industry.
“Professional tennis players are targets of cyberbullying, which can be extremely damaging to their physical and mental well-being,” Veira said.
The tool will be active throughout the tournament, and officials will monitor social media activity closely.
The French Open is the latest in a series of sports events that have taken steps to address the issue of harassment and cyberbullying. In 2020, the English Premier League also introduced similar technology to counter online attacks on players.
Veira hopes that this will mark a new era in the sports industry, where players can feel safe from online harassment both on and off the court.
“We hope that this technology is going to provide a safer space for our players and that it is going to help us eradicate cyberbullying from the game,” Veira concluded.