RELIABILITY OPTIMIZATION IN ULSI VIA AI MODELS
Abstract
Ultra-Large-Scale Integration (ULSI) technology has enabled the integration of billions of transistors on a single chip,
powering modern computing devices. However, as device dimensions shrink and complexity increases, reliability challenges such as
wear-out mechanisms, process variations, and transient faults become critical. This paper explores the application of Artificial
Intelligence (AI) models to optimize reliability in ULSI circuits. By leveraging machine learning and deep learning techniques, it develops predictive
models for failure analysis, lifetime estimation, and fault detection. The results demonstrate significant improvements in
reliability prediction accuracy and enable proactive optimization strategies, contributing to enhanced robustness and performance of ULSI chips.
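
As a minimal illustration of the kind of predictive model the abstract describes, the Python sketch below trains a gradient-boosted regressor to estimate device lifetime from process and stress features. The feature set, the synthetic failure data, and the choice of model are assumptions made for illustration only; they are not taken from the paper's actual methodology.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 2000

# Hypothetical per-device features (names and units are assumptions):
# normalized process variation, gate-oxide thickness (nm),
# junction temperature (deg C), supply voltage (V).
X = np.column_stack([
    rng.normal(0.0, 1.0, n),   # process variation
    rng.normal(2.0, 0.1, n),   # oxide thickness
    rng.uniform(25, 125, n),   # junction temperature
    rng.uniform(0.7, 1.1, n),  # supply voltage
])

# Synthetic time-to-failure (hours): thinner oxide, higher temperature,
# and higher voltage all shorten lifetime; noise mimics measurement error.
y = (1e5
     * np.exp(-0.02 * (X[:, 2] - 25))   # thermal acceleration
     * np.exp(-3.0 * (X[:, 3] - 0.9))   # voltage acceleration
     * (X[:, 1] / 2.0) ** 4             # oxide-thickness dependence
     + rng.normal(0.0, 500.0, n))

# Train a lifetime-estimation model and report held-out error.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
pred = model.predict(X_test)
print(f"MAE on held-out devices: {mean_absolute_error(y_test, pred):.1f} hours")

In practice, such a model would be trained on accelerated-stress-test or field-return data rather than synthetic samples, and its predictions could drive the proactive optimization strategies the abstract refers to.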