Wednesday, January 28, 2009

Speeding up testing

There is an interesting article in GCN that looks at speeding up software testing in the US Department of Defense:

One way to eliminate these barriers is to simplify the testing and certification process, and the governance that drives it, by bringing together developers, testers and users in an integrated development and testing environment. The Defense Information Systems Agency provides a good model for this with its Federated Development and Certification Environment.

Another approach is to build governance structures that increase trust, cooperation and standardisation between different organisations that conduct testing and certification. Over the years, different organisations in DOD have developed their own sets of testing standards and processes, and many still don’t accept each other’s testing and certification results.

By standardising test-acceptance criteria, DOD can realise one of the biggest potential payoffs of SOA and related strategies: re-use.
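
The re-use payoff is easy to sketch in code. Here is a minimal illustration (my own, with hypothetical categories and thresholds, not anything from the article) of shared, machine-readable test-acceptance criteria: if every organisation evaluates test reports against the same thresholds, one group's certification results can be accepted by another without re-running the tests.

```python
# A shared report format: each entry maps a test category to its pass rate.
# The categories and thresholds here are hypothetical placeholders.
ACCEPTANCE_CRITERIA = {
    "functional": 1.00,   # every functional test must pass
    "security": 1.00,     # every security test must pass
    "performance": 0.95,  # a small tolerance is allowed on performance tests
}

def accept_results(report: dict) -> bool:
    """Return True if a test report meets every shared acceptance threshold."""
    return all(
        report.get(category, 0.0) >= threshold
        for category, threshold in ACCEPTANCE_CRITERIA.items()
    )

if __name__ == "__main__":
    # A report produced by another organisation's test run.
    partner_report = {"functional": 1.0, "security": 1.0, "performance": 0.97}
    print(accept_results(partner_report))  # True: the results can be re-used
```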


Interesting that it has taken the emergence of Service Oriented Architectures to drive large government organisations away from their antiquated approach to testing and into the modern world. Hurrah for SOA.

Sunday, January 25, 2009

Static testing and doing the right thing

I read an interesting paper about static analysis and defect detection at Embedded.com.

Perhaps the most surprising thing about static analysis is not that it can detect memory leaks, buffer overflows, or incorrect pointer assignments at compile time, but rather that users of static analysis will often fail to fix such detected defects.

One of the most frequently misunderstood aspects of static analysis is that it is distinctly different from other bug finding techniques. Static analysis users are often aware of the advantages it provides in defect detection, while simultaneously failing to realise that a different approach must be taken for defect resolution.

Static analysis testing reports defects in a different way than dynamic testing and user reported defects; these differences must be understood and appreciated to effectively improve software quality with any static analysis tool.
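
The paper's headline defects (memory leaks, buffer overflows, bad pointer assignments) are C and C++ territory, but the point holds in any language. As an illustration of my own, not taken from the paper, here are two small Python defects that a static analyser such as pylint reports without ever running the code:

```python
def append_item(item, bucket=[]):
    # pylint flags this as dangerous-default-value: the default list is
    # created once and shared between calls, a latent bug that a single
    # dynamic test run may never trigger.
    bucket.append(item)
    return bucket

def describe(value):
    if value > 0:
        label = "positive"
    # A static checker flags 'label' as possibly unbound: when value <= 0
    # the assignment never happens and this line raises at runtime.
    return label
```

Note how resolution differs from a dynamic test failure, just as the quoted paper argues: there is no failing run to reproduce. The report points straight at a code path, and fixing the defect means reasoning about that path rather than debugging an observed failure.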


If you're interested in learning how to take an effective approach to static analysis testing, take a look at the paper.

Sunday, January 18, 2009

Testing SOA

I came across an interesting article on testing SOA (Service Oriented Architecture) developments at Havemacwillblog. As well as covering unit testing, it looks at the issues peculiar to integration testing SOA applications:

In the pleasant but inefficient world of application silos, testing was relatively easy. You tested the silo. You unit tested the components of the application. Next you integration tested the whole. Then, if you had any qualms about performance, you stress tested the whole thing. Ultimately you gave it over for acceptance testing. Creating the test bed was not a problem.

The problem with SOA is that it’s end-to-end. The silos have gone and with them went the ability to easily set up a test bed. As you make your way down the SOA road, you will need to have much more versatile test-bed creation capabilities.

Stress testing/performance testing is also going to be a bit of a challenge as regards the test bed. Depending on how a new composite application is going to run, you will probably want the stress testing to be modeled around the whole set of end-to-end software that will quite probably span many different servers.
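
The article stays at the strategy level, but a minimal sketch shows what "versatile test-bed creation" can mean in practice. In this illustration (mine, with hypothetical service names and payloads, using only the Python standard library), each downstream service a composite application depends on is replaced by a throwaway in-process stub, so the end-to-end path can be exercised without standing up every real server:

```python
import json
import threading
import unittest
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


def start_stub(payload: dict) -> HTTPServer:
    """Start a throwaway HTTP stub on a free port, returning its server."""
    class StubHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(payload).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):  # keep test output quiet
            pass

    server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0 = any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


class EndToEndTest(unittest.TestCase):
    def test_composite_flow(self):
        # Stub the two services the composite application would normally call.
        inventory = start_stub({"sku": "A-1", "in_stock": True})
        pricing = start_stub({"sku": "A-1", "price": 9.99})
        try:
            stock = json.load(urlopen(f"http://127.0.0.1:{inventory.server_port}"))
            quote = json.load(urlopen(f"http://127.0.0.1:{pricing.server_port}"))
            # The "composite" logic under test: quote only in-stock items.
            self.assertTrue(stock["in_stock"])
            self.assertEqual(quote["price"], 9.99)
        finally:
            inventory.shutdown()
            pricing.shutdown()


if __name__ == "__main__":
    unittest.main()
```

The same stubs take some of the sting out of stress testing too: a load generator can be pointed at the composite entry point while the stubs absorb the downstream traffic, rather than waiting for a full multi-server environment to be provisioned.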
