Aspect-Based Sentiment Analysis (ABSA) enables fine-grained sentiment classification by associating opinions with specific aspects of an entity. Traditional approaches often require extensive labeled datasets and struggle with domain adaptation, multilingual contexts, and linguistic complexities such as sarcasm and negation. In this paper, we present a transformer-based zero-shot learning framework for ABSA, applied to hotel reviews. Our approach leverages pre-trained large language models (LLMs), enabling aspect-level sentiment inference without task-specific fine-tuning. Extensive experiments on multilingual hotel review datasets in English, Arabic, German, and Spanish demonstrate that our zero-shot method consistently outperforms lexicon-based, machine learning (ML), and fine-tuned deep learning (DL) baselines across multiple evaluation metrics. Ablation studies confirm the importance of each pipeline component, and multilingual evaluations show strong generalization across languages without labeled training data. Statistical analysis further validates the model's stability and significant performance gains. This work advances scalable, domain-adaptable ABSA solutions and highlights the practical potential of zero-shot learning for multilingual sentiment analysis in real-world industry settings.
Aspect-Based Sentiment Analysis Using Transformer-Based Zero-Shot Learning for Hotel Reviews
- Written by Saed Alqaraleh, Amani Kanti, Hasan ALHUSEIN
- Category: Computer Science
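The abstract describes the method only at a high level. As a self-contained illustration of the zero-shot ABSA idea (scoring an NLI-style hypothesis such as "The {aspect} was {label}." against each review, one hypothesis per aspect-label pair), the sketch below stubs the model call with a toy keyword-proximity scorer so it runs without LLM weights. The aspect list, cue words, and template wording are assumptions for illustration, not the paper's actual prompts or model.

```python
# Illustrative sketch of zero-shot ABSA via NLI-style hypothesis templates.
# A real system would score each (review, hypothesis) pair with a pre-trained
# NLI/LLM model; score() here is a toy keyword-proximity stand-in (an
# assumption, not the paper's method) so the example is self-contained.

ASPECTS = ["room", "staff", "location"]            # hypothetical hotel aspects
CUES = {"positive": {"great", "friendly", "spotless", "excellent"},
        "negative": {"dirty", "rude", "noisy", "terrible"}}

def score(review: str, aspect: str, label: str) -> float:
    """Toy stand-in for an entailment score of the hypothesis
    'The {aspect} was {label}.' against the review."""
    tokens = review.lower().replace(".", " ").replace(",", " ").split()
    if aspect not in tokens:
        return 0.0                                 # aspect not mentioned
    i = tokens.index(aspect)
    window = set(tokens[max(0, i - 2): i + 3])     # cues near the aspect word
    return 1.0 if window & CUES[label] else 0.1

def absa(review: str) -> dict:
    """Label each mentioned aspect; fall back to 'neutral' without cues."""
    result = {}
    for aspect in ASPECTS:
        scored = {lbl: score(review, aspect, lbl) for lbl in CUES}
        best = max(scored, key=scored.get)
        if scored[best] >= 1.0:
            result[aspect] = best
        elif scored[best] > 0.0:                   # mentioned, but no cue
            result[aspect] = "neutral"
    return result

print(absa("The room was spotless but the staff were rude."))
# → {'room': 'positive', 'staff': 'negative'}
```

In the real pipeline, the stub would be replaced by a pre-trained model's entailment probability, which is what lets the same code classify unseen aspects and languages with no task-specific training data.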