DVCon US Keynote: Why Verification Must Evolve in the Convergence Era
At DVCon US 2026, a keynote delivered by three speakers—Abhi Kolpekwar (Siemens EDA), Jean-Marie Brunet (Siemens EDA), and Alon Shtepel (Micron)—shared a simple but important message: Verification is no longer a scaling problem. It’s a systems problem.
For years, the industry has successfully handled increasing design size. More transistors? Add more compute. More IP? Run more regressions. That model served us well.
But something changed. Despite all that scaling, first-silicon success continues to decline. So what’s really going on?
Complexity Has Shifted: From Size to Interaction
As Abhi Kolpekwar explained, modern systems aren’t just bigger. They are fundamentally different. They are:
- Modular
- Software-defined
- Highly configurable
And most importantly, their behavior emerges from interaction.
Hardware interacts with software. Power interacts with performance. Protocols interact with real workloads. This leads to an important realization: Individually correct components can still fail when combined.
This is where traditional verification assumptions begin to break down. The idea that “more tests equals more confidence” no longer holds when correctness depends on system-level behavior.
Verification Has Become a Feedback Loop
Another key takeaway is that verification is no longer a linear process. It is now a learning system. Each run does more than check correctness. It informs what to do next. Failures reshape priorities. Coverage redirects effort. Understanding builds over time. Much of the real work happens between runs:
- Interpreting results
- Debugging failures
- Deciding next steps
This is where teams feel the strain. Execution has scaled, but insight has not kept pace.
The Bottleneck Is Decision-Making
It is easy to assume that faster simulation or more compute will solve the problem. But the keynote made something clear:
Execution is rarely the true bottleneck.
The real challenge is managing iteration. Teams need to interpret results and steer verification effectively. This is where a new approach is emerging.
We need intelligence between the steps:
- Systems that observe results
- Systems that understand context
- Systems that help guide the next action
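To make the idea concrete, here is a minimal Python sketch of what such a closed-loop driver might look like. Everything here is illustrative: `run_test` is a deterministic stand-in for a simulator job, and `adaptive_regression` simply reorders tests so that failures and coverage gaps rise to the front of the next pass. A real flow would dispatch jobs to a compute farm and parse coverage databases instead.

```python
def run_test(name, seed):
    # Deterministic stand-in for a simulation run (hypothetical):
    # derives a pass/fail result and a coverage score from the
    # test name and seed, in place of launching a simulator.
    score = (sum(map(ord, name)) * (seed + 1)) % 100
    return score > 20, score / 100.0

def adaptive_regression(tests, iterations=3):
    # Sketch of a closed-loop flow: each pass reprioritizes tests,
    # so failing or poorly covered tests run first on the next pass.
    priority = {t: 0.0 for t in tests}
    history = []
    for it in range(iterations):
        # Run highest-priority tests first (stable sort keeps order on ties).
        ordered = sorted(tests, key=lambda t: -priority[t])
        for t in ordered:
            passed, cov = run_test(t, seed=it)
            history.append((it, t, passed, cov))
            # Failures and coverage gaps raise a test's priority;
            # clean, well-covered tests drop down the queue.
            priority[t] = (0.0 if passed else 1.0) + (1.0 - cov)
    return priority, history

priority, history = adaptive_regression(["smoke", "dma_stress", "pcie_link"])
```

The point of the sketch is the shape of the loop, not the scoring formula: each iteration consumes the results of the last one, which is exactly the "intelligence between the steps" the keynote describes.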
Generative AI can reduce manual setup. Agentic AI can coordinate workflows across tools. Together, they reduce friction and allow engineers to focus on higher-value decisions.
Insight Depends on Trustworthy Execution
Jean-Marie Brunet emphasized a key constraint. If execution does not reflect real system behavior, the resulting insight cannot be trusted.
Today’s systems depend on:
- Real software stacks
- Real workloads
- Power and performance interactions
If these elements are abstracted away, verification can miss critical issues. This is why faithful, connected execution is essential across simulation, emulation, and prototyping.
As systems scale, hardware-assisted verification becomes foundational rather than optional.
AI Is Accelerating Verification, with Engineers in Control
Alon Shtepel shared a practical perspective from real-world teams. AI is already helping with:
- Documentation and specifications
- Testbench and assertion generation
- Debug and analysis
But success is not just about adopting AI. It is about using it responsibly. Teams must:
- Validate results instead of blindly trusting them
- Maintain guardrails
- Keep engineers in control
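A toy sketch of such a guardrail is shown below, using Python source as a stand-in for AI-generated code. The function name `guarded_accept` and the acceptance policy are hypothetical; a real flow would lint and compile generated HDL and check it against a reference model. The idea is simply that generated output must clear a machine check and a named engineer's sign-off before it enters the flow.

```python
import ast

def guarded_accept(generated_code, reviewed_by=None):
    # Guardrail sketch: AI-generated code is accepted only if it
    # parses cleanly AND a named engineer has signed off on it.
    try:
        # Machine check: is the output even syntactically valid?
        ast.parse(generated_code)
    except SyntaxError:
        return False, "rejected: does not parse"
    if not reviewed_by:
        # Human check: keep an engineer in the loop.
        return False, "rejected: no engineer sign-off"
    return True, f"accepted: reviewed by {reviewed_by}"
```

Either check failing blocks the code, which mirrors the keynote's point: AI output is a draft to be validated, not a result to be trusted.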
The goal is not automation for its own sake. The goal is to amplify engineering judgment.
The Path Forward
Across all three talks, a clear direction emerged. Verification is becoming a closed-loop, adaptive system.
Shifting from:
- Fixed flows to adaptive workflows
- Isolated runs to continuous learning
- Manual effort to AI-assisted guidance
At the center of this evolution are engineers, now supported by better insight, faster iteration, and smarter tools.
Watch the Full Keynote
This post highlights just a few ideas from DVCon US 2026. To understand how verification is changing and what it means for your team, watch the full keynote: Beyond Bigger Designs: Rethinking Verification for the Era of Convergence
Bottom line:
Complexity did not just increase. It shifted.
Verification must evolve to keep up.