RELIABILITY OPTIMIZATION IN ULSI VIA AI MODELS


Swarnita Gorakshnath Kale
Rahul Ganpat Mapari

Abstract

Ultra-Large-Scale Integration (ULSI) technology has enabled the integration of billions of transistors on a single chip,
powering modern computing devices. However, as device dimensions shrink and complexity increases, reliability challenges such as
wear-out mechanisms, process variations, and transient faults become critical. This paper explores the application of Artificial
Intelligence (AI) models to optimize reliability in ULSI circuits. By leveraging machine learning and deep learning techniques, predictive
models for failure analysis, lifetime estimation, and fault detection are developed. The results demonstrate significant improvements in
reliability prediction accuracy and enable proactive optimization strategies, contributing to enhanced robustness and performance of ULSI chips.
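As a minimal illustration of the lifetime-estimation idea mentioned in the abstract (not the paper's actual models), the sketch below fits a least-squares model that predicts device lifetime from hypothetical stress features; the feature names, coefficients, and synthetic data are assumptions chosen only to mimic the general trend that lifetime degrades with temperature and supply voltage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stress features for 200 synthetic devices:
# operating temperature (K) and supply voltage (V).
n = 200
temp = rng.uniform(300, 400, n)
vdd = rng.uniform(0.7, 1.1, n)

# Synthetic ground-truth lifetime (arbitrary units), loosely inspired by
# thermal/voltage-accelerated wear-out trends, plus measurement noise.
lifetime = 20.0 - 0.03 * (temp - 300) - 8.0 * (vdd - 0.7) + rng.normal(0, 0.2, n)

# Least-squares fit: lifetime ≈ w0 + w1*temp + w2*vdd
X = np.column_stack([np.ones(n), temp, vdd])
w, *_ = np.linalg.lstsq(X, lifetime, rcond=None)

# Predict lifetime at a new operating point (350 K, 0.9 V).
pred = np.array([1.0, 350.0, 0.9]) @ w
```

In practice the paper's machine-learning and deep-learning models would replace this linear fit, and real aging data (not synthetic samples) would supply the training set; the sketch only shows the overall shape of a data-driven lifetime estimator.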

Article Details

How to Cite
Kale, S. G., & Mapari, R. G. (2024). RELIABILITY OPTIMIZATION IN ULSI VIA AI MODELS. International Journal of Advanced Scientific Research and Engineering Trends, 8(12), 28–30. Retrieved from https://journals.mriindia.com/index.php/ijasret/article/view/2012
