Layout parasitics significantly impact the performance of analog integrated circuits, leading to discrepancies between schematic and post-layout performance and requiring several iterations to achieve design convergence. Prior work has accounted for parasitic effects during the initial design phase but relies on in-the-loop automated layout generation to estimate parasitics. In this work, we leverage recent developments in parasitic prediction using graph neural networks to eliminate the need for in-the-loop layout generation. We propose an improved surrogate performance model that uses parasitic graph embeddings from the pre-trained parasitic prediction network. We further leverage dropout as an efficient estimate of predictive uncertainty for Bayesian optimization to automate transistor sizing. Experimental results demonstrate that the proposed surrogate model achieves a 20% better R² prediction score and improves optimization convergence by 3.7x and 2.1x compared with conventional Gaussian process regression and neural-network-based Bayesian linear regression, respectively. Furthermore, including parasitic prediction in the optimization loop guarantees satisfaction of all design constraints, whereas schematic-only optimization fails numerous constraints when verified with parasitic estimates.
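The dropout-based uncertainty mentioned above can be illustrated with a minimal Monte Carlo dropout sketch: dropout is kept active at prediction time, and repeated stochastic forward passes yield a mean (the surrogate's prediction) and a standard deviation (the uncertainty fed to the Bayesian optimizer's acquisition function). The network below is a hypothetical stand-in with random, untrained weights, not the surrogate model from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny surrogate: 4 design variables -> 1 performance metric.
# Weights are random placeholders; a real surrogate would be trained on
# (transistor sizing -> simulated performance) data.
W1 = rng.normal(size=(4, 32)) * 0.1
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1)) * 0.1
b2 = np.zeros(1)
P_DROP = 0.2  # dropout rate

def predict_once(x, sample_rng):
    """One stochastic forward pass with dropout left ON."""
    h = np.maximum(x @ W1 + b1, 0.0)               # ReLU hidden layer
    mask = sample_rng.random(h.shape) >= P_DROP    # sample a dropout mask
    h = h * mask / (1.0 - P_DROP)                  # inverted-dropout scaling
    return (h @ W2 + b2).ravel()

def mc_dropout_predict(x, n_samples=100, seed=0):
    """Mean and std over n_samples stochastic forward passes."""
    sample_rng = np.random.default_rng(seed)
    samples = np.stack([predict_once(x, sample_rng) for _ in range(n_samples)])
    return samples.mean(axis=0), samples.std(axis=0)

# 5 candidate sizings, 4 design variables each
x = rng.normal(size=(5, 4))
mu, sigma = mc_dropout_predict(x)
```

In a Bayesian optimization loop, `mu` and `sigma` would plug into an acquisition function such as expected improvement, replacing the posterior mean and variance a Gaussian process would otherwise supply.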
This material is posted here with permission of the IEEE. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to pubs-permissions@ieee.org.