Learning-Based Approximation of Interconnect Delay and Slew in Signoff Timing Tools

Author(s)
Kang, Seokhyeong; Kahng, Andrew B.; Lee, Hyein; Nath, Siddhartha; Wadhwani, Jyoti
Issued Date
2013-06-02
DOI
10.1109/SLIP.2013.6681682
URI
https://scholarworks.unist.ac.kr/handle/201301/46772
Fulltext
https://ieeexplore.ieee.org/document/6681682
Citation
2013 ACM/IEEE International Workshop on System Level Interconnect Prediction, SLIP 2013
Abstract
Incremental static timing analysis (iSTA) is the backbone of iterative sizing and Vt-swapping heuristics for post-layout timing recovery and leakage power reduction. Performing such analysis through the available interfaces of a signoff STA tool incurs efficiency and functionality limitations. Thus, an internal iSTA tool that matches the signoff STA tool must be built. A key challenge is matching the "black-box" modeling of interconnect effects in the signoff tool, so as to match wire slew, wire delay, gate slew, and gate delay on each arc of the timing graph. Previous moment-based analytical models for gate and wire slew and delay typically have large errors when compared to values from signoff STA tools. To mitigate the accumulation of these errors and preserve timing correlation, sizing tools must invoke the signoff STA tool frequently, thus incurring large runtime costs. In this work, we pursue a learning-based approach to fit analytical models of wire slew and delay to estimates from a signoff STA tool. These models improve the accuracy of delay and slew estimates, such that the number of invocations of the signoff STA tool during sizing optimization is significantly reduced.
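The idea of fitting analytical delay models to signoff-tool estimates can be illustrated with a minimal sketch. The snippet below is hypothetical and not from the paper: it uses synthetic per-arc moment features (a first moment standing in for Elmore delay, a second moment, and input slew), pretends a nonlinear "golden" signoff delay, and fits a least-squares linear correction over the analytic features, showing how a learned model can track the golden values far more closely than the raw first-moment estimate.

```python
# Hypothetical sketch: fit a linear correction that maps moment-based
# analytical delay features to "golden" delays from a signoff STA tool.
# All feature names and the golden-delay function are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: per-arc circuit moments as features.
n_arcs = 200
m1 = rng.uniform(1.0, 50.0, n_arcs)          # first moment (Elmore delay, ps)
m2 = m1**2 * rng.uniform(1.1, 1.5, n_arcs)   # second moment (roughly ~ m1^2)
input_slew = rng.uniform(5.0, 100.0, n_arcs) # driver input slew (ps)

# Pretend "signoff" delays: a function the raw Elmore estimate misses,
# plus small noise standing in for tool-internal effects.
golden_delay = (0.9 * m1 + 0.02 * np.sqrt(m2) + 0.05 * input_slew
                + rng.normal(0.0, 0.5, n_arcs))

# Least-squares fit of a simple linear model over the analytic features.
X = np.column_stack([m1, np.sqrt(m2), input_slew, np.ones(n_arcs)])
coef, *_ = np.linalg.lstsq(X, golden_delay, rcond=None)
pred = X @ coef

rmse_analytic = np.sqrt(np.mean((m1 - golden_delay) ** 2))    # raw Elmore vs golden
rmse_fitted = np.sqrt(np.mean((pred - golden_delay) ** 2))    # fitted model vs golden
print(f"RMSE raw Elmore: {rmse_analytic:.2f} ps, fitted: {rmse_fitted:.2f} ps")
```

A tool following this recipe would train on arcs characterized once by the signoff STA tool, then use the fitted model during sizing iterations, calling the signoff tool only occasionally to re-anchor the fit. The paper's actual models and features are richer than this linear sketch.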
Publisher
IEEE

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.