November 17, 2025
AI model predicts which short videos on major platforms could spark suicidal thoughts
By Lisa Lock, scientific editor, and Andrew Zinin, lead editor

A new study published in Information Systems Research finds that certain short-form videos on major platforms can trigger suicidal thoughts among vulnerable viewers and that a newly developed AI model can flag these high-risk videos before they spread. The research delivers one of the first data-driven, medically informed tools for detecting suicide-related harms in real time, giving platforms a clearer early-warning signal at a moment when youth mental-health concerns are rising and scrutiny of platform safety is intensifying.
The study was conducted by Jiaheng Xie of the University of Delaware, Yidong Chai of the Hefei University of Technology and City University of Hong Kong, Ruicheng Liang of Anhui University of Finance and Economics, Yang Liu of the Hefei University of Technology and Daniel Dajun Zeng of the Chinese Academy of Sciences.
Their work comes as short-form video use grows at staggering speed. Globally, 1.6 billion people consume short clips on TikTok, Douyin and similar platforms, yet experts have raised alarms about content that glamorizes or normalizes self-harm. Viewers often express emotional distress directly in the comment sections of these videos, giving platforms a real-time signal of harm.
"Our goal was to help platforms understand when a video might trigger suicidal thoughts and to catch those risks before they spread," said Xie. "The comments people leave are powerful indicators of how video content affects them, especially when viewers feel anonymous and more willing to share what they are feeling."
The research team developed a knowledge-guided neural topic model, a type of artificial intelligence that combines medical expertise about suicide risk factors with patterns found in real video content. The model predicts the likelihood that a new video will generate suicidal-thought comments, allowing moderation teams to intervene before the video reaches wider audiences.
Unlike existing methods that treat all videos and comments the same, the model distinguishes between what creators choose to post and what viewers think or feel after watching. It also separates known medical risk factors from emerging social media trends, such as viral heartbreak clips or challenges that may influence teens.
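The core idea described above — seeding the model with medical risk-factor knowledge and scoring creator content separately from viewer reactions — can be illustrated with a toy sketch. This is not the authors' model; the risk lexicon, weights, and logistic link below are all assumptions made for illustration, using only the Python standard library.

```python
# Illustrative sketch only -- not the paper's knowledge-guided neural
# topic model. It mimics two of the paper's design ideas: (1) seed the
# scorer with a medical risk-factor vocabulary, and (2) treat what the
# creator posts and what viewers comment as separate channels.

import math

# Hypothetical seed lexicon of medical risk-factor terms (an assumption,
# not taken from the study).
RISK_SEEDS = {"hopeless", "worthless", "self-harm", "alone", "goodbye"}

def risk_score(creator_text: str, viewer_comments: list[str]) -> float:
    """Return a 0-1 score: a toy estimate of the chance that a video
    elicits suicidal-thought comments."""
    def seed_rate(text: str) -> float:
        # Fraction of tokens matching the seed lexicon.
        tokens = text.lower().split()
        if not tokens:
            return 0.0
        hits = sum(1 for t in tokens if t.strip(".,!?") in RISK_SEEDS)
        return hits / len(tokens)

    creator = seed_rate(creator_text)
    viewers = sum(seed_rate(c) for c in viewer_comments) / max(len(viewer_comments), 1)
    # Toy weights (assumed): viewer reactions weigh more than creator framing,
    # echoing the article's point that comments are strong harm signals.
    z = 4.0 * creator + 8.0 * viewers - 1.0
    return 1.0 / (1.0 + math.exp(-z))  # logistic link

score = risk_score(
    "feeling hopeless tonight",
    ["you are not alone", "this made me cry"],
)
print(score > 0.5)  # flagged for human review
```

In the actual system, the seed lexicon would come from clinical literature and the scoring function would be a learned neural topic model rather than fixed weights; the two-channel split, however, mirrors the distinction the researchers draw between creator content and viewer responses.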
"Short-form videos often mix personal stories, emotion-driven visuals and intense themes," said Chai. "By bringing medical knowledge directly into the AI model, we can detect harmful content more reliably and surface it to human moderators when it matters most."
The model outperformed other state-of-the-art tools and revealed medically relevant themes that appear in videos linked to suicidal-thought expressions. For platforms, this means automated systems can more accurately flag videos for follow-up by human reviewers, improving consistency and reducing the volume of content they must assess manually.
The authors note that the model is designed to support, not replace, human judgment. They emphasize that moderation teams should continue to make final decisions based on platform policies, legal standards and ethical considerations.
The findings offer practical guidance for platforms facing mounting scrutiny over teen safety and mental health harms. With lawsuits, regulatory pressure and rising public concern, the researchers say tools like theirs could help reduce preventable tragedies.
More information: Jiaheng Xie et al, Short-Form Videos and Mental Health: A Knowledge-Guided Neural Topic Model, Information Systems Research (2025). DOI: 10.1287/isre.2024.1071
Journal information: Information Systems Research
Provided by the Institute for Operations Research and the Management Sciences
Citation: AI model predicts which short videos on major platforms could spark suicidal thoughts (2025, November 17), retrieved 17 November 2025 from https://techxplore.com/news/2025-11-ai-short-videos-major-platforms.html
