Sunny Sunday morning. Everyone still dead asleep. Perfect time to pick up what I had started months ago: my HIRS@M series. Sorry for keeping you waiting.
In this post I want to shed light on the purpose of reviewing specs: the reasons I value a review, reasons that are often overlooked, IMHO.
So this is #1. What's next? #2 and #3! Promised, though no promise as to when.
Let me give a sneak preview:
As I implicitly pointed out in the prologue of this series, good speccing is about driving quality upstream. Good speccing is about making sure you are designing what is needed. Not a Pontiac Aztek when a Rolls-Royce is wanted. Or indeed a Pontiac Aztek, if that is what's asked for.
There are many actions, skills, and requisites that make up and define good speccing, and one of them is spec reviewing.
BTW: if you are looking for the full monty on specs, I would like to recommend Karl Wiegers' Software Requirements.
As with many disciplines, speccing is a team effort; actually, I would say it should be. For this reason the multidisciplinary development teams I worked in at MS were involved in getting specs in place right from the start. Whereas the program manager(s) (PM) would primarily be gathering requirements and writing the spec, the first contribution of dev, test and, if available, UA (i.e. user assistance, the documentation writer) would be to review it. This setup facilitated a multidisciplinary, and as such a multi-perspective, view on the matter. And thus, as Steve McConnell states it in Code Complete:
"Better code quality arises from multiple sets of eyes ..."
To me it's as plain as that: multiple sets of eyes looking at the same thing. You often do not even need to be an expert on the topic to be able to make a valuable contribution to a spec review. Well-skilled spec writers often enough experience that their initial perspective was too narrow! Missing a wide enough range of perspectives/disciplines in spec review potentially inflicts problems downstream. Or as Wiegers puts it, with respect to involvement of the customer:
"… we know that lack of customer involvement greatly increases the risk of building the wrong product"
At the end of the day it's all about cost. Every hour spent on reviewing pays off:
"Several companies have avoided as much as 10 hours of labor for every hour they invested in inspecting requirements documents ..."
From multiple sources McConnell calculates the relative costs for fixing defects based on "when they're introduced and detected":
Or in words:
"In general, the principle is to find an error as close as possible to the time at which it was introduced. The longer the defect stays …, the more damage it causes further down the chain"
The numbers depend, of course, on the type of software development business you're in. When building a solution specifically for a very small group of customers, defects introduced at an early stage will not have as dramatic (cost) consequences as in the case of building a standard solution for hundreds or thousands of customers, like Dynamics NAV.
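To make the payoff quoted above concrete, here is a back-of-the-envelope sketch. It only uses the 10:1 ratio from the Wiegers quote; the team size and hours are assumptions purely for illustration, not figures from any source:

```python
def review_payoff(reviewers: int, hours_each: float, payoff_ratio: float = 10.0) -> float:
    """Estimate downstream labor hours avoided by a spec review.

    payoff_ratio defaults to the 10:1 figure quoted above ("as much as
    10 hours of labor for every hour they invested").
    """
    hours_invested = reviewers * hours_each
    return hours_invested * payoff_ratio

# Illustrative example: a 5-person team (PM, dev, test, UA, customer rep)
# each spending 2 hours reviewing a spec.
saved = review_payoff(reviewers=5, hours_each=2)
print(saved)  # 100.0 -- up to 100 hours of downstream rework avoided
```

A toy calculation, obviously, but it shows why even a seemingly expensive multi-person review meeting tends to be cheap compared to the rework it prevents.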
IMHO it's not only the multitude of perspectives during reviewing that helps to drive quality upstream. It's also the simple fact of getting a team engaged. In general, being part of a process from the start creates bigger involvement, as people will be in the know, feel they can make a difference, and thereby be set to contribute to the quality of the product. You might say this is true even when the team members concerned did not (always) actively make contributions to the spec review. They were there; they got the opportunity.
It's clear that "everyone owns quality" as How We Test Software at Microsoft states (page 392), so you better make sure your team is in the best position to carry this weight.
Fortunately for all Dynamics NAV pros, some years ago PACKT Publishing stepped into the hole in NAV documentation that had been there for years. With their latest book, written by my fellow countryman Mark Brummel, another persistent gap was filled: the one left between Implementing and Programming Microsoft Dynamics NAV 2009. On this Mark has done a tremendous job "to cramp so much information in so little space", as kriki states it.
With this review, however, I have not set out to cover all the topics in the book, but simply want to highlight those that caught my eye.
Clearly Mark's adage "Look, Learn, Love" applies to his book too. It undoubtedly adds a new perspective to NAV documentation and, even for celebrated NAV pros, presents enough facts to make the book worthwhile reading. Mark's long-running history with NAV guarantees that, even though, I have to confess, I had to get used to his English phrasing. But it surely did not stop me from reading!
One of the things for which I will be frequently referring to this book in the future is the schemas. It has been a public secret for ages that one of the major flaws in (technical) NAV documentation was the schematic representation of the various functional areas in NAV, i.e. the (almost) total absence of it. On this subject alone the book has filled a major gap.
The numerous meaningful and very useful (code) examples should also be mentioned; to be learned from by any newbie developer, to be used by experienced ones.
One of the few off-topics, and I honestly hadn't expected anything about it, is "Testing". As Mark himself writes (p 443), "Testing is probably one of the most important but under-rated tasks of application design". But then he spends less than a page on it. Another gap to be filled by PACKT?
Altogether BAD is great.
I guess now, a little over a week after returning from my summer vacation, I have settled back into normal mode again. "Work and get paid?", you're thinking.
Well, yes, you're right, but that's not what I meant. Just simply getting back to writing a new blog item. Starting to think about one wasn't hard, as during my absence my blog apparently got bombarded by a significant load of Russian comments. To be frank: I just treated them as spam, as I unfortunately am not equipped with any understanding of Russian whatsoever. You know, the load wasn't that abnormal; that they were all in Russian (or at least that's what I think they were), however, was new.
Nevertheless, these comments were not what triggered me to write this post. It was actually today's blog statistics. Have a look.
Since my last post on Wed, Jul 14 2010, the number of hits had settled down to, say, about 70 a day, with somewhat lower numbers on the weekends. Busy as usual. But see what happened today! I hadn't written a new post and all of a sudden ... BANG ... "sky high".
No Russian comments, I can tell you. I could not even trace any hits on any of my posts today ...
Have any of you bloggers experienced the same?