Journal of Artificial Intelligence and Data Mining, Volume 13, Issue 3, Pages 359-368

Persian title
Persian abstract
Persian keywords

English title: Attention Mechanisms in Transformers: A General Survey
English abstract: Attention mechanisms have significantly advanced machine learning and deep learning across various domains, including natural language processing, computer vision, and multimodal systems. This paper presents a comprehensive survey of attention mechanisms in Transformer architectures, emphasizing their evolution, design variants, and domain-specific applications in NLP, computer vision, and multimodal learning. We categorize attention types by their goals, such as efficiency, scalability, and interpretability, and provide a comparative analysis of their strengths, limitations, and suitable use cases. This survey also addresses the lack of visual intuitions, offering a clearer taxonomy and a discussion of hybrid approaches, such as sparse-hierarchical combinations. In addition to foundational mechanisms, we highlight hybrid approaches, theoretical underpinnings, and practical trade-offs. The paper identifies current challenges in computation, robustness, and transparency, offering a structured classification and proposing future directions. By comparing state-of-the-art techniques, this survey aims to guide researchers in selecting and designing attention mechanisms best suited to specific AI applications, ultimately fostering the development of more efficient, interpretable, and adaptable Transformer-based models.
English keywords: Attention mechanism, Transformer, deep learning
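For orientation, the sketch below shows the standard scaled dot-product attention that the surveyed variants build on. It is a minimal, illustrative NumPy implementation; the function name, shapes, and toy data are assumptions for this sketch and are not drawn from the paper itself.

# Minimal sketch of single-head scaled dot-product attention (illustrative only).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V.

    Q, K: arrays of shape (seq_len, d_k); V: array of shape (seq_len, d_v).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                           # (seq_len, seq_len) similarity scores
    scores = scores - scores.max(axis=-1, keepdims=True)      # numerical stability before softmax
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)   # row-wise softmax attention weights
    return weights @ V                                         # weighted sum of value vectors

# Toy self-attention over a 4-token sequence with model dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)

The efficiency-oriented variants discussed in the survey (e.g. sparse or hierarchical attention) mainly change how the (seq_len, seq_len) score matrix is formed or approximated, since that term dominates the quadratic cost.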

Authors: Rasoul Hosseinzadeh | Department of Computer Engineering, Science and Research SR.C., Islamic Azad University, Tehran, Iran.

Mahdi Sadeghzadeh | Department of Computer Engineering, Science and Research SR.C., Islamic Azad University, Tehran, Iran.


URL: https://jad.shahroodut.ac.ir/article_3521_7a48fc3c8b98a9c2ffeba1a3e4dfafa4.pdf
Article file: no file has been stored for this article
Article code (DOI):
Published article language: en
Published article subjects:
Published article type: