Recent Advances in Brain MRI Image Classification for Cancer Detection Using Transformer and Group Parallel Axial Attention with Quantum Self-Attention: A Systematic Review
Abstract
Brain tumor detection using Magnetic Resonance Imaging (MRI) is a critical component of modern neuro-oncology, enabling early diagnosis and effective treatment planning. Traditional machine learning approaches often struggle with complex tumor structures and variability in imaging modalities, limiting their clinical applicability. Recent advancements in deep learning, particularly Transformer-based architectures, group parallel axial attention mechanisms, and emerging quantum self-attention models, have significantly improved classification accuracy and efficiency.
Transformer models utilize self-attention mechanisms to capture long-range dependencies in MRI images, achieving superior performance compared to conventional Convolutional Neural Networks (CNNs). Axial attention further enhances efficiency by decomposing the attention operation across spatial dimensions, attending along rows and columns separately, which reduces computational complexity while preserving contextual information. Additionally, quantum self-attention introduces a novel paradigm by integrating quantum computing principles into deep learning frameworks, aiming at richer feature representations and more efficient optimization.
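To make the efficiency argument concrete, the decomposition can be sketched as follows. This is a minimal NumPy illustration (not code from any reviewed study): attending along each spatial axis in turn costs O(HW·(H+W)) score computations for an H×W feature map, versus O((HW)²) for full 2-D self-attention. The function names and the single-head, unprojected Q=K=V setup are simplifying assumptions for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention over the second-to-last axis (the sequence)
    d = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d)   # (..., seq, seq)
    return softmax(scores, axis=-1) @ v            # (..., seq, d)

def axial_attention(x):
    # x: (H, W, d) feature map; attend along each spatial axis separately
    rows = attention(x, x, x)                      # row pass: sequences of length W
    cols_in = rows.swapaxes(0, 1)                  # (W, H, d): columns become sequences
    cols = attention(cols_in, cols_in, cols_in)    # column pass: sequences of length H
    return cols.swapaxes(0, 1)                     # back to (H, W, d)

x = np.random.randn(8, 8, 16)
y = axial_attention(x)
assert y.shape == x.shape
```

Each position still aggregates information from its entire row and (via the second pass) its entire column, which is how contextual information is preserved despite never forming the full (HW)×(HW) attention matrix.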
This systematic review analyzes studies from 2020 to 2023, highlighting key trends, including hybrid CNN-Transformer architectures, attention optimization strategies, and explainable AI integration. Comparative analysis reveals that hybrid and attention-based models achieve classification accuracy exceeding 99%. Despite these advancements, challenges such as data scarcity, computational complexity, and interpretability persist. Future research should focus on lightweight models, multi-modal learning, and quantum-enhanced architectures for improved clinical deployment.
Article Details

This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.