
META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT

Published

Author(s)

Apostol Vassilev, Honglan Jin, Munawar Hasan

Abstract

Detecting out-of-policy speech (OOPS) content is important but difficult. While machine learning is a powerful tool for tackling this challenging task, it is hard to break through the performance ceiling due to factors such as limits on the quantity and quality of training data and inconsistencies in the definition and labeling of OOPS. To realize the full potential of the available limited resources, we propose a meta learning technique (MLT) that combines individual models built with different text representations. We show analytically that the resulting technique is numerically stable and produces reasonable combining weights. We combine the MLT with a threshold-moving (TM) technique to further improve the performance of the combined predictor on highly imbalanced in-distribution and out-of-distribution datasets. We also provide computational results that show the statistically significant advantages of the proposed MLT approach.
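
Illustrative sketch (not from the publication): the abstract names two ingredients, a weighted combination of base-model outputs and threshold moving for imbalanced data. The Python/NumPy sketch below shows one plausible minimal version of each under assumed interfaces; the least-squares weighting, the helper names fit_combining_weights and pick_threshold, and the F1-based threshold scan are illustrative assumptions, not the paper's derivation.

import numpy as np

def fit_combining_weights(val_probs, val_labels):
    # val_probs: (n_samples, n_models) positive-class probabilities
    # val_labels: (n_samples,) in {0, 1}
    w, *_ = np.linalg.lstsq(val_probs, val_labels.astype(float), rcond=None)
    w = np.clip(w, 0.0, None)                  # keep weights nonnegative
    s = w.sum()
    return w / s if s > 0 else np.full(len(w), 1.0 / len(w))

def pick_threshold(scores, labels, grid=np.linspace(0.05, 0.95, 19)):
    # Threshold moving: scan cutoffs and keep the best-F1 one instead of 0.5.
    def f1(t):
        pred = scores >= t
        tp = np.sum(pred & (labels == 1))
        fp = np.sum(pred & (labels == 0))
        fn = np.sum(~pred & (labels == 1))
        return 2 * tp / (2 * tp + fp + fn) if tp else 0.0
    return max(grid, key=f1)

# Toy usage with three synthetic base models on a 5%-positive dataset.
rng = np.random.default_rng(0)
labels = (rng.random(1000) < 0.05).astype(int)
val_probs = np.stack(
    [np.clip(labels + rng.normal(0.0, s, 1000), 0.0, 1.0) for s in (0.3, 0.4, 0.5)],
    axis=1,
)
w = fit_combining_weights(val_probs, labels)
combined = val_probs @ w
print("weights:", w, "threshold:", pick_threshold(combined, labels))

The point of the threshold scan is that on imbalanced data the F1-optimal cutoff is generally not the default 0.5.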
Citation

arXiv 2310.15019

Keywords

Natural language processing · Out-of-policy speech detection · Meta learning · Deep learning · Large language models

Citation

Vassilev, A., Jin, H. and Hasan, M. (2023), META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT, arXiv, [online], https://doi.org/10.48550/arXiv.2310.15019, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=956374, https://arxiv.org/ (Accessed November 21, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created October 23, 2023, Updated October 25, 2023