AssertLLM: Generating and Evaluating Hardware Verification Assertions from Design Specifications via Multi-LLMs
CoRR (2024)
Abstract
Assertion-based verification (ABV) is a critical method for ensuring design
circuits comply with their architectural specifications, which are typically
described in natural language. This process often requires significant
interpretation by engineers to convert these specifications into functional
verification assertions. Existing methods for generating assertions from
natural language specifications are limited to sentences extracted by
engineers, discouraging the practical application. In this work, we present
AssertLLM, an automatic assertion generation framework for complete
specification files. AssertLLM breaks down the complex task into three phases,
incorporating three customized Large Language Models (LLMs) for extracting
structural specifications, mapping signal definitions, and generating
assertions. Additionally, we provide an open-source benchmark for assessing
assertion generation capabilities. Our evaluation of AssertLLM on a full
design, encompassing 23 signals, demonstrates that 89% of the generated
assertions are both syntactically and functionally accurate.
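The three-phase decomposition described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the keyword-based extraction heuristic, and the placeholder assertion format are all our own assumptions, with each phase standing in for one of the customized LLMs.

```python
# Hypothetical sketch of AssertLLM's three-phase flow; names and heuristics
# are illustrative assumptions, not the paper's actual prompts or models.

def extract_structural_specs(spec_text):
    # Phase 1 (stand-in for LLM #1): pull requirement sentences
    # out of the full specification file.
    return [line.strip() for line in spec_text.splitlines() if "shall" in line]

def map_signal_definitions(specs, signal_table):
    # Phase 2 (stand-in for LLM #2): map natural-language signal names
    # mentioned in each requirement to RTL identifiers.
    return [(s, {name: rtl for name, rtl in signal_table.items() if name in s})
            for s in specs]

def generate_assertions(mapped):
    # Phase 3 (stand-in for LLM #3): emit one placeholder SVA-style
    # assertion per mapped requirement.
    return [f"assert property (/* {spec} */ {' && '.join(rtl.values()) or '1'});"
            for spec, rtl in mapped]

spec = "The req signal shall be held high until ack asserts.\nRevision history follows."
table = {"req": "i_req", "ack": "o_ack"}
asserts_out = generate_assertions(
    map_signal_definitions(extract_structural_specs(spec), table))
print(asserts_out[0])
```

In the actual framework each phase is a customized LLM rather than a heuristic, but the data flow is the same: full specification file in, signal-mapped requirements in the middle, checkable assertions out.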