I’ve written extensively about agile development from the perspective of a verification engineer. From the beginning, I’ve been firmly of the opinion that increased agility in development teams starts with the team itself. A team critiques its approach, adopts new practices (most likely from Extreme Programming or Scrum), refines and perfects those practices, critiques again, adopts more, refines, perfects, and so on in a continuous cycle of improvement. This is the most practical method for building agile, responsive development teams that I’ve seen recommended in software; it’s also the method I subscribe to for hardware teams.
From the start, I’ve been skeptical of the role tool providers would play – even could play, if I’m being honest – in encouraging a continuous cycle of improvement that pulls from agile development. My view from outside EDA was that a reliance on tools and our industry’s expectations of tools hold us back at times. High-tech tool solutions to low-tech problems have been a source of frustration for me. EDA isn’t completely to blame for that, but EDA sells the tools, so there were definitely times you could have pointed to me as an EDA skeptic.
Along that line, I expected joining Mentor would mean accepting a few irreconcilable differences between my own view of what agile development teams could be and the tool-centric view of my new employer. It made me nervous, for sure. But 8 months in, I’m happy to announce I was nervous for nothing. Within Mentor I’ve found a genuine interest in enabling agile development. More importantly, that interest grows out of customer interactions. It’s customers driving adoption of agile development practices and Mentor responding with supporting tools; exactly how I’d hoped.
One practice I’m referring to is called incremental development. I find it fundamental to agile development. We’ll get to the specifics of how Mentor supports incremental development in a follow-up. For the rest of this post I’ll give some explanation of incremental development itself, talk about how it depends on us accepting a new definition of done, then sum it all up with a few general comments on Questa Verification Run Manager (VRM) and how we can change our scheduling habits.
One of four values listed in the agile manifesto is working software over comprehensive documentation. To the agile software community, this has been embraced through the idea that the only reliable indicator of progress is working software. A feature being Done means it’s been coded, tested, works as intended and could be put in front of a customer. Just being documented doesn’t count; being coded doesn’t really matter either. Done is measured against the software, nothing else. Is a feature deliverable? If ‘yes’, it’s done; if ‘no’, it isn’t.
In my former life as a verification engineer I used a similarly strict definition of done. I took the approach that the only reliable indicator of progress is a passing test. I went that way because from a functional point-of-view passing tests are the closest thing we have to working hardware. If a test passes, some corresponding feature is done; if not, it isn’t. The documentation or code I’d written didn’t matter; objective test results, that was it.
With that strict definition of done, we move to incremental development. Incremental development is an approach where entire products are split into feature sets. Each feature set has a corresponding list of tests. Teams focus on development and verification one set at a time. When the tests from one set all pass, the team moves on to the next. Progress looks like a staircase: each completed feature set is an incremental step closer to a finished product. Each step takes about a month or less – I prefer “or less” – and is polished to customer-level quality.
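The staircase can be sketched in a few lines of pseudocode-ish Python. The feature set names and the test runner here are entirely hypothetical – a real flow would launch simulations and collect results through your regression environment – but the control flow is the point: one feature set at a time, advancing only when every test in the set passes.

```python
# Hypothetical feature sets, each with its list of tests.
# Names are invented for illustration; yours will come from your test plan.
FEATURE_SETS = {
    "reset_and_config": ["test_reset", "test_reg_access"],
    "basic_traffic":    ["test_single_pkt", "test_back_to_back"],
    "error_handling":   ["test_crc_error", "test_timeout"],
}

def run_test(name):
    """Stub: in practice this would launch a simulation and parse the result."""
    return True  # assume pass, for illustration only

def develop_incrementally(feature_sets):
    """Work through feature sets in order; stop at the first set that isn't done."""
    completed = []
    for feature, tests in feature_sets.items():
        # A feature set is done only when every one of its tests passes.
        if all(run_test(t) for t in tests):
            completed.append(feature)  # one incremental step up the staircase
        else:
            break  # stay on this step until it's customer-level quality
    return completed

print(develop_incrementally(FEATURE_SETS))
```

Note there is no notion of partial credit inside a step: a feature set with nine of ten tests passing is not done, so the team does not advance.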
A strict definition of done and incremental development are relatively simple ideas but with huge implications. The first is that emphasis on regression setup and results starts much earlier, effectively day 1, because passing tests are the only way to show you’ve accomplished anything. Further, we focus on incrementally developing only what’s required to support each feature set so we can run tests on each feature set as quickly as possible and without superfluous code getting in the way. Incremental development is the alternative to big bang development, the practice of creating entire designs and testbenches with test writing/running to follow.
A much larger implication is – and this is exciting for team leads, project managers and anyone who hates fudging numbers in weekly status reports – the obsolescence of task-based scheduling.
Yes, I know how that sounds. But stay with me here…
The task-based schedules we use to track development progress are highly ambiguous and terribly inaccurate. This is especially true during active design and testbench development, when the documentation and code we’ve written serve as our primary metric. Inevitably, everything runs smoothly up to 90% done, then it stays at 90% for months as we work out the kinks. I know task-based schedules feel logical, but practice breaks our logic every time. The clarity they provide into development progress usually ranks somewhere between translucent and opaque. They fail us repeatedly and it’s time to move on.
Which brings us to Questa VRM and the role it should be playing in front-end development. Cutting to the chase: test plans are the new schedule, regression statistics are the new progress and it’s all organized by VRM. If you have 100 tests and 50 of them pass, you’re half done. 90 tests passing? Congrats, you’re actually 90% done this time. No tests passing? Practically speaking, you haven’t started, regardless of what you’ve done. The tasks you’ve completed don’t matter anymore, only the tests run and the results collected by VRM.
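The arithmetic above is deliberately simple, and that’s the appeal. Here’s a minimal sketch of the metric – progress is passing tests over planned tests, nothing more. The results dictionary is invented for illustration; in practice those numbers would come out of your regression reports rather than being computed by hand.

```python
def progress(results):
    """Progress is simply the fraction of planned tests that pass."""
    passing = sum(1 for outcome in results.values() if outcome == "pass")
    return 100.0 * passing / len(results)

# Hypothetical regression snapshot: 100 planned tests, 50 currently passing.
results = {f"test_{i}": ("pass" if i < 50 else "fail") for i in range(100)}

print(f"{progress(results):.0f}% done")  # 50 of 100 passing
```

No judgment calls, no fudging: a test either passes or it doesn’t, and the percentage falls out directly.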
Like I said, we’ll get to the details in a follow-up. But to summarize so far, if you’re looking to increase your productivity and add objectivity to your development approach by building the responsiveness of an agile development team…
- Create an objective definition of done
- Capture a test plan early in your development cycle
- Develop and test your way through the design incrementally
- Run regressions and capture feedback with VRM
- Use VRM reports to view progress and predict delivery dates
Finally, if you use MS Project or some comparable project scheduling tool, it’s time to consider cashing that in for the real-time visibility and predictability Questa VRM provides. Next post, we’ll talk more about what that would look like in practice.
PS: If you’re in the Boston area this Friday, Oct 18th, stop by DVClub to hear more on regression management in agile teams from Mentor’s Tom Fitzpatrick!