Research Article

Improving Emotion Recognition in Social Media through Multimodal Data Fusion Technique

by Pawan, R.K. Sharma
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Issue 64
Published: December 2025
Authors: Pawan, R.K. Sharma
DOI: 10.5120/ijca2025926083

Pawan and R.K. Sharma. Improving Emotion Recognition in Social Media through Multimodal Data Fusion Technique. International Journal of Computer Applications 187, 64 (December 2025), 37-41. DOI=10.5120/ijca2025926083

@article{10.5120/ijca2025926083,
  author    = { Pawan and R.K. Sharma },
  title     = { Improving Emotion Recognition in Social Media through Multimodal Data Fusion Technique },
  journal   = { International Journal of Computer Applications },
  year      = { 2025 },
  volume    = { 187 },
  number    = { 64 },
  pages     = { 37-41 },
  doi       = { 10.5120/ijca2025926083 },
  publisher = { Foundation of Computer Science (FCS), NY, USA }
}
%0 Journal Article
%D 2025
%A Pawan
%A R.K. Sharma
%T Improving Emotion Recognition in Social Media through Multimodal Data Fusion Technique
%J International Journal of Computer Applications
%V 187
%N 64
%P 37-41
%R 10.5120/ijca2025926083
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Emotion recognition in social media is challenging because the text is noisy, informal, and highly contextual. This study presents an improved BERT-based framework for classifying six primary emotions: Happy, Sad, Angry, Fear, Surprise, and Neutral. The methodology combines rigorous preprocessing, contextual tokenization, class-imbalance handling through random oversampling, and optimized fine-tuning of BERT. A comprehensive experimental setup was employed, including detailed evaluation metrics, confusion-matrix analysis, and performance comparison across varying training configurations. High-resolution figures and expanded result interpretations provide deeper insight into model behavior, particularly for the minority classes. The proposed approach demonstrates strong performance on social-media datasets and establishes a foundation for future multimodal fusion techniques involving text, emojis, and visual cues.
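
The pipeline summarized in the abstract (contextual tokenization, random oversampling of minority classes, and fine-tuning BERT for six emotion labels) can be sketched as follows. This is a minimal Python illustration assuming the Hugging Face transformers library and PyTorch; the toy texts, the 128-token limit, and the training hyperparameters are placeholders for illustration, not the configuration reported by the authors.

# Minimal sketch: random oversampling of minority emotion classes
# followed by BERT fine-tuning (hypothetical settings).
from collections import Counter
import random

import torch
from torch.utils.data import Dataset
from transformers import (BertTokenizerFast, BertForSequenceClassification,
                          Trainer, TrainingArguments)

LABELS = ["Happy", "Sad", "Angry", "Fear", "Surprise", "Neutral"]

def oversample(texts, labels, seed=42):
    """Randomly duplicate minority-class samples until every class
    matches the majority-class count."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    out_texts, out_labels = [], []
    for cls in counts:
        items = [t for t, l in zip(texts, labels) if l == cls]
        extra = rng.choices(items, k=target - len(items)) if len(items) < target else []
        for t in items + extra:
            out_texts.append(t)
            out_labels.append(cls)
    return out_texts, out_labels

class EmotionDataset(Dataset):
    """Tokenizes raw posts with the BERT tokenizer and exposes
    (input_ids, attention_mask, labels) items for the Trainer."""
    def __init__(self, texts, labels, tokenizer, max_len=128):
        self.enc = tokenizer(texts, truncation=True, padding="max_length",
                             max_length=max_len, return_tensors="pt")
        self.labels = torch.tensor([LABELS.index(l) for l in labels])
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        return {**{k: v[i] for k, v in self.enc.items()}, "labels": self.labels[i]}

# Placeholder data; in practice these would come from a preprocessed
# social-media corpus (URL/user-mention cleanup, emoji handling, etc.).
train_texts = ["so excited about today!!", "this is the worst day ever", "meh"]
train_labels = ["Happy", "Angry", "Neutral"]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased",
                                                      num_labels=len(LABELS))

texts_bal, labels_bal = oversample(train_texts, train_labels)
train_ds = EmotionDataset(texts_bal, labels_bal, tokenizer)

args = TrainingArguments(output_dir="bert-emotion", num_train_epochs=3,
                         per_device_train_batch_size=16, learning_rate=2e-5)
Trainer(model=model, args=args, train_dataset=train_ds).train()

In a setup like the one described, evaluation on a held-out split would additionally report per-class precision, recall, and F1 along with a confusion matrix, to expose behavior on the minority classes.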

Index Terms
Computer Science
Information Sciences
Keywords

Emotion Recognition, BERT, Multimodal Data Fusion, Social Media Analysis, Class Imbalance, Machine Learning, Exploratory Data Analysis
