Friday, April 29, 2011

Requirements are long lasting

I have been working with large systems and maintenance for a long time, typically within the Enterprise Application domain. I have worked on projects where the code should be "self documenting" and on projects that followed RUP diligently. The first became unstable after a while because of unclear requirements. Too much code inspection was needed to explain to the functional owners what the system did, and it was hard to tell whether a given behavior was a bug or a feature. Refactoring was also hell. In the second we spent too much time not writing code, and the system did not really reflect the formal requirements. There was just too much documentation to put up with. From experience I believe there must be an information model (and some Glossary or Concept descriptions), a description of behavior (e.g. Use Cases), and sequence or activity descriptions tying this together. I have also worked with TDD (or similar approaches where tests are written in tandem with the code) and seen what a great effect this has on good code modularization.

I have no experience with user stories, but I see and hear (together with BDD and TDD) that they are a good approach to requirements handling that fits the backlog and suits the sprints. Together with a suite of functional tests, this is a solid combination for a project delivering an application. My concern is that developing the system is only approximately 10% of the lifetime cost (or less). During the lifetime there will be different system owners and developers, and they need to understand how the system should work. There will be reviews and overall planning later on that need to understand the system without looking into the code. In such a context user stories seem too fragmented; they read more like a changelog for the development than a description of the system.

For the last 90% of the lifetime there is a need for a robust requirements base. How do we ensure that this is maintained? I do believe (from experience) that the requirements must be described as the system is developed and kept aligned with the version that is released. At that point I also think the functional parts should form a whole, more as defined by good Use Case practice. Regardless of how you wish to document the requirements, at least keep them in a common and persistent store (not Post-it notes), so that there is a stable base for future reference.

Also: it is hard to keep such documentation in place; it certainly requires diligence, commitment and thoroughness to keep up.
In a development team I believe that someone must be responsible for the overall functional package, and that it is not just up to each developer to write user stories. All too often I have seen functionality written again without reusing or extending what has already been developed.
Requirements are long lasting by Tormod Varhaugvik is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Thursday, April 14, 2011

New tools, old methods. A mason's story

We have all these new tools, but we must also change the way our developers and architects solve problems with them. We have to understand new tools and methods to actually make better software systems. Otherwise we end up with classes of 6000 lines, bad separation of concerns between layers or components, no clue about MVC, no interfaces, and so on.

As an example from constructing a house:
You have decided to build a house in wood. All your requirements point in that direction. But all you have are masons. What do you think that house would look like without the proper training (or an on-site inspection of an existing wooden house)?

A house in wood is constructed very differently from a house in bricks. You first build a frame with beams, then insulate and cover with boards. The mason would probably lay every beam horizontally with mortar in between. It would surely be a costly house, and it would have none of the qualities we originally specified it to have. The masons would also think you were an idiot.
New tools, old methods. A mason's story by Tormod Varhaugvik is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Tuesday, April 12, 2011

Java is mature. The software-teens are over.

Just a few words on why we would expect less expensive maintenance and better quality in our software systems by using Java and following the standards (de facto or formal), and on why I think Java is mature and will not go away for some time.

I would like to compare this with the construction of things. Until well into the industrial revolution, every part of something (e.g. a clock) was made by hand. Every craftsman had his own approach, and it was labor intensive. Screws, for example, came with all sorts of screw threads with different angles, lead, pitch, diameter and handedness. This also meant that if something broke, you had to get that exact copy, probably from the same craftsman (or his son...); no other screw would do.

The same goes for the construction of houses. Well into the '50s there was no obvious standard. Every window in a house was tailor-made; they looked alike but had small deviations due to optimizing the usage of materials. The size of a beam was chosen from the "feeling" of the carpenter and the available resources. Standardization has brought many good things into play: training of carpenters; estimation of cost, material and labor; lower prices due to mass production; everything built in 60 cm sections.

Now what has this to do with software? Most of today's legacy systems (all the way from the early cuts at software development) were built in a time or with a technology where everything had to be made from scratch. There were virtually no components and no good technology for using components. Training was poor too; Computer Science was in its infancy. Every programmer mastered his own discipline, and strange arguments could be heard in the corridors: should arrays start with 0 or 1? Big-endian or not? New line before or after the comma? These "great" discussions are really a waste of time; the real value is not there. It is in standardization and in doing things so that others can do maintenance and understand what the system does. But as with the screws and windows, doing is learning; it was the software-teens we had to get through.

These legacy systems work, but only after close study, and they will always be hard to maintain, simply because everything is tailor-made. If they were buildings, they would either be condemned or protected by the antiquarian.

Recently we had a PoC where we found that validating an XML file (by streaming through it) takes just a few lines of code. The various legacy versions are much larger. Which do you think gives the most cost-efficient maintenance and ease of change? Which is most robust to changes in who does maintenance on the system? Which one is faster?
SchemaFactory schemaFactory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
Schema schema = schemaFactory.newSchema(new File(schemaFile));
Validator validator = schema.newValidator();
validator.validate(new StreamSource(new BufferedInputStream(new FileInputStream(xmlFile))));
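For completeness, here is a minimal, self-contained sketch of the same JAXP validation. The inline schema, the `order` document and the class name are hypothetical, added only to make the example runnable on its own:

```java
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class ValidateXml {
    public static void main(String[] args) throws Exception {
        // Hypothetical schema and document written to temp files, just to make the sketch runnable.
        Path xsd = Files.createTempFile("order", ".xsd");
        Files.write(xsd, ("<?xml version=\"1.0\"?>"
                + "<xs:schema xmlns:xs=\"http://www.w3.org/2001/XMLSchema\">"
                + "<xs:element name=\"order\" type=\"xs:string\"/>"
                + "</xs:schema>").getBytes("UTF-8"));
        Path xml = Files.createTempFile("order", ".xml");
        Files.write(xml, "<order>42</order>".getBytes("UTF-8"));

        // The actual validation: a handful of lines, streaming through the document.
        SchemaFactory schemaFactory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = schemaFactory.newSchema(xsd.toFile());
        Validator validator = schema.newValidator();
        // Throws SAXException if the document does not conform to the schema.
        validator.validate(new StreamSource(new BufferedInputStream(new FileInputStream(xml.toFile()))));
        System.out.println("valid");
    }
}
```

Note that the validator reads the input as a stream, so memory usage stays flat no matter how large the document is.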

I think that after approximately 60 years of software development and now 15 years of Java (that is, 45 years without it), we have a very mature language, a great set of standardized technologies, a vast number of components and systems, a large community and supplier base, and a resilient run-time architecture.

I can't see that we could make a better choice.

(We are seeing new efforts towards more efficient languages, but for now they are niche players. If a new language were to replace Java, it would probably take 10-15 years of maturing before it could actually be there. Just think of all the work put into the Virtual Machine... And as I have argued, the real value is in how we actually construct software systems, not in the language by itself.)
Java is mature. The software-teens are over by Tormod Varhaugvik is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.