Advancements in Neural Architecture Search for Automated Model Design


Dipannita Mondal
Sheetal S. Patil

Abstract

Neural Architecture Search (NAS) has emerged as a transformative approach to automating the design of deep learning models, significantly reducing the human effort and expertise required for model architecture engineering. This paper reviews recent advancements in NAS techniques, including differentiable search methods, reinforcement learning-based approaches, and evolutionary algorithms. We explore the impact of these methods on model efficiency, scalability, and accuracy across tasks such as image classification, natural language processing, and reinforcement learning. Furthermore, we discuss the integration of hardware-aware optimization strategies that balance model complexity with real-world deployment constraints. The convergence of NAS with self-supervised learning and foundation models is examined, highlighting a paradigm shift toward generalized and automated AI systems. Despite this progress, challenges remain, including high computational costs, limited generalizability, and the trade-off between exploration and exploitation in search strategies. We conclude by outlining future research directions, emphasizing the need for sustainable and interpretable NAS frameworks that democratize access to state-of-the-art AI models across diverse applications.
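The differentiable search methods mentioned in the abstract relax the discrete choice among candidate operations into a continuous, softmax-weighted mixture, so that architecture parameters can be learned by gradient descent alongside the network weights. The following is a minimal illustrative sketch of that idea in PyTorch; the MixedOp class, the particular candidate operations, and the alphas parameter are assumptions chosen for illustration, not the implementation described in this paper.

# Minimal sketch of a differentiable NAS (DARTS-style) mixed operation.
# Illustrative only; not the method evaluated in the paper. Assumes PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Weights candidate operations by a softmax over learnable architecture parameters."""
    def __init__(self, channels):
        super().__init__()
        # Candidate operations for one edge of a search cell (illustrative subset).
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        # Architecture parameters (alphas), optimized jointly with the model weights.
        self.alphas = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alphas, dim=0)
        # Output is the softmax-weighted sum of all candidate operations.
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Usage: gradients flow to both the operation weights and the alphas.
x = torch.randn(1, 16, 32, 32)
edge = MixedOp(16)
edge(x).mean().backward()

After the search phase, the operation with the largest architecture weight on each edge is typically retained, yielding a discrete architecture for retraining and deployment.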


Article Details

How to Cite
Mondal , D., & Patil, S. S. (2025). Advancements in Neural Architecture Search for Automated Model Design. International Journal of Recent Advances in Engineering and Technology, 13(1), 1–5. Retrieved from https://journals.mriindia.com/index.php/ijraet/article/view/54
