COLUMBIA, S.C. (AP) — Nearly one in five social media posts flagged by an AI-based algorithm and confirmed as abusive toward athletes, coaches and other officials during NCAA championship events involved sexual harassment, and 12% were related to sports betting, according to results of a pilot study released Thursday.
The study, the college athletics governing body’s first on online harassment, used Signify Group’s Threat Matrix to examine more than 72,000 messages flagged by an algorithm. More than 5,000 of those posts were confirmed to contain abusive, discriminatory or threatening content and were reported to social media companies.
The study, conducted during 2023-24, examined social media posts related to championship-level events in six sports: baseball, basketball, gymnastics, football, softball and volleyball.
Of the abusive posts, the study found 80% were directed at March Madness athletes, with female basketball players receiving about three times more abusive messages than their male counterparts.
The study cited one unidentified athlete who received more than 1,400 harassing messages in a two-week span.
“The risks and mental health challenges associated with being a victim of online abuse or threats are real and have a direct and immediate effect on athletes, coaches, officials, and their families,” the NCAA wrote in its report. “This can impact them on both a personal and professional level, and ultimately affect their wellbeing and ability to perform at their best.”
The NCAA said sports-betting harassment was spread across all the championships covered in the study.
Racial comments made up about 10% of the abusive messages studied, and the study found the men’s and women’s NCAA basketball tournaments were a particular focus of such content.
“Toxic online fans resorted to racist mockery, comparing players to monkeys and labelling them as thugs,” the NCAA’s study found.
Women’s basketball players, teams and officials received such treatment, the NCAA said.
“The level of Dogwhistle content during the (basketball) Women’s championships should be highlighted as well in connection with racism,” the governing body said in the report.
The study monitored the accounts of 3,164 student-athletes, 489 coaches, 197 game officials, 165 teams and 12 NCAA official channels using Signify Group’s artificial intelligence Threat Matrix. It identified varying areas of online abuse and threats, and established 16 categories in which to organize messages that were deemed to be abusive. The NCAA told The Associated Press the algorithm’s flagging system was based on a series of issue-specific keywords and human analysts organized abusive messages into the categories.
Violence was found to be the subject of 6% of all verified abusive and threatening content, according to the study.
College football at the FBS level, the men’s basketball tournament and volleyball all received “high proportions of violent, abusive or threatening content.”
Other threatening messages were connected to homophobia and transphobia, doping and steroid use, and match officials.
The NCAA said the risks extend across all sports, noting that in some instances volleyball and gymnastics generated more “concerning” abusive messages than March Madness or the College Football Playoff.
The NCAA report said social media abuse and threats can have a significant effect on athletes and others involved in college sports.
Even if an athlete who is the target of such abuse says they are fine, “this should not be assumed to be the case,” the NCAA said.
Such targeting can continue after a welfare check, “which is why action needs to be taken to protect them long-term,” the organization said.
NCAA President Charlie Baker said the study is evidence of what some athletes deal with during their college careers.
“We will exhaust all options to reduce the harassment and vitriol student-athletes are experiencing too often today,” he said in a statement.