
DeepBlues@LT-EDI-ACL2022: Depression Level Detection Modelling Through Domain Specific BERT and Short Text Depression Classifiers.

Proceedings of the Second Workshop on Language Technology for Equality, Diversity and Inclusion (LT-EDI 2022), 2022

Abstract
We discuss a variety of approaches for building a robust depression level detection model from longer social media posts (e.g., Reddit depression forum posts) using a mental-health-text-informed pre-trained BERT model. Further, we report experimental results based on a strategy that selects excerpts from long texts and then fine-tunes the BERT model, mitigating the memory constraints of processing such texts. We show that, with domain-specific BERT, we can achieve reasonable accuracy with a fixed text size (in this case 200 tokens). In addition, we can use short text classifiers to extract relevant excerpts from the long text and obtain some accuracy improvement, albeit at the cost of the extra processing time required to extract those excerpts.
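
Below is a minimal sketch, not the authors' released code, of the fixed-size-excerpt strategy the abstract describes: keep the first 200 tokens of each long post and fine-tune a domain-specific BERT classifier on those excerpts. The checkpoint name, label set, and placeholder data are assumptions chosen for illustration only.

```python
# Sketch of the fixed-size-excerpt fine-tuning described in the abstract.
# Assumptions: "mental/mental-bert-base-uncased" as the mental-health BERT
# checkpoint, a 3-way depression level label set, and toy in-memory data.
import torch
from torch.utils.data import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

MAX_TOKENS = 200                                 # fixed excerpt size from the abstract
MODEL_NAME = "mental/mental-bert-base-uncased"   # assumed domain-specific checkpoint
NUM_LABELS = 3                                   # e.g. not depressed / moderate / severe


class ExcerptDataset(Dataset):
    """Tokenizes each post, truncating to the first MAX_TOKENS tokens."""

    def __init__(self, texts, labels, tokenizer):
        self.enc = tokenizer(texts, truncation=True, max_length=MAX_TOKENS,
                             padding="max_length")
        self.labels = labels

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(
        MODEL_NAME, num_labels=NUM_LABELS)

    # Placeholder examples; the paper fine-tunes on Reddit depression posts.
    train_texts = ["I have felt hopeless for months and nothing helps ...",
                   "Had a rough week but things are looking up."]
    train_labels = [2, 0]

    train_ds = ExcerptDataset(train_texts, train_labels, tokenizer)

    args = TrainingArguments(output_dir="excerpt-bert",
                             num_train_epochs=3,
                             per_device_train_batch_size=8)
    Trainer(model=model, args=args, train_dataset=train_ds).train()
```

The same dataset class could be fed excerpts chosen by a short-text depression classifier instead of simple truncation, which is the trade-off the abstract mentions: better excerpt relevance in exchange for extra selection time.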