Thursday, November 17, 2011

AlertSite multi-browser test support

SmartBear Software (supplier of the QAComplete and TestComplete testing tools) have enhanced AlertSite, their web and mobile performance monitoring solution, to support testing on both the IE and Firefox browsers.

"Organisations can now test, monitor and improve how their website is performing across both browsers to ensure customers are getting an optimal user experience, no matter their browser preference.

The AlertSite built-into-the-browser transaction recorder captures and plays back user events to measure the end-user experience exactly as it unfolds in the browser. Now users can record any multi-step transaction or click stream and play it back on multiple browsers to simulate user experience. Unlike other Web monitoring tools that require separate recording processes for each browser, AlertSite users can simply record once and monitor from both Firefox and Internet Explorer across multiple locations."
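AlertSite's recorder is proprietary and built into the browser, but the underlying record-once, play-back-on-multiple-browsers idea can be sketched with the open-source Selenium WebDriver library. This is purely an illustration of the concept, not AlertSite's implementation; the URL and element names below are placeholders.

```python
# Illustrative sketch only: replaying one scripted transaction on both
# Firefox and Internet Explorer with Selenium WebDriver. The URL and
# element names are placeholders, not taken from AlertSite.
from selenium import webdriver
from selenium.webdriver.common.by import By

def search_transaction(driver):
    """One multi-step click stream, replayed unchanged on each browser."""
    driver.get("https://www.example.com")                    # placeholder URL
    driver.find_element(By.NAME, "q").send_keys("widgets")   # placeholder field
    driver.find_element(By.NAME, "btnSearch").click()        # placeholder button
    assert "results" in driver.title.lower()                 # crude success check

# The same script runs against each browser; a monitoring product would
# also schedule it from multiple locations and record response times.
for make_driver in (webdriver.Firefox, webdriver.Ie):
    driver = make_driver()
    try:
        search_transaction(driver)
    finally:
        driver.quit()
```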

Tuesday, November 15, 2011

Soasta and Selenium

A recent article in Dr. Dobb's reported that Soasta has added Selenium-based functional testing capabilities to its CloudTest platform. This extends the open-source Selenium tool with test recording functionality and other features, such as the ability to test in distributed environments and new analytics.

Thursday, October 27, 2011

The Art of Software Testing (revisited)

Only a few days now until the third edition of The Art of Software Testing is available. Publishers Wiley give a date of 8th November 2011.

The classic guide to software testing, updated

Since the first edition of The Art of Software Testing, the hardware and software of computing have changed dramatically. Yet this book has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches.

The new Third Edition applies the original classic principles to today's hot topics, including: testing apps for iPhones, iPads, BlackBerry smartphones, Android phones and tablets, and other mobile devices; collaborative (user) programming and testing; and testing for Internet applications, e-commerce, and agile programming environments.

Students and IT managers alike will find The Art of Software Testing, Third Edition an indispensable resource that pays for itself with the first bug you discover and fix.

Monday, May 09, 2011

Topics for the International Conference on Practical Software Quality and Testing

  • Testing websites and e-Commerce applications
  • Agile approaches for testing
  • Static testing: reviews, inspections
  • Defect tracking and studies
  • Load, stress testing
  • Testing technology, process and automation
  • Integration, systems and regression testing
  • Use Cases and testing
  • Successful tool usage
  • Security testing
  • Risk-based testing
  • Certified Test Manager (CTM)
  • Testable requirements
  • Testing related topics
  • Managing test function and processes
  • Test Effort estimation
  • Developing, implementing standards
  • Risk management and mitigation
  • End-to-End testing
  • Developing, managing a test team
  • Requirements management, modeling
  • Certified Software Test Professional (CSTP)

Tuesday, May 03, 2011

The Art of Software Testing

Long-term readers of this blog may remember previous entries that referred to Glenford Myers, author of "The Art of Software Testing", such as FAQ No. 8 and FAQ No. 11, or even this entry on The Art of Software Security Testing.

Well I have just learned that the 3rd edition of The Art of Software Testing is due out towards the back end of 2011 (I think October). The new edition includes some new material covering
"how to apply the book's classic principles to today's hot topics including:
  • Testing apps for iPhones, iPads, BlackBerrys, Androids, and other mobile devices
  • Collaborative (user) programming and testing
  • Testing for Internet applications, e-commerce, and agile programming environments"
Roll on October.

Tuesday, April 26, 2011

Mobile device testing

If you've ever wondered about some of the problems with testing mobile applications, then you might want to check out this article. It contains eight lessons for developers of mobile applications. Some may seem self-evident (like "Expect users to make mistakes"), but it's amazing how easy it is to overlook them in the heat of development.

My favourite is No. 1: Focus on User Experience. It's clear that smartphones have become mobile computing devices and are increasingly used as such. What is often overlooked is that the smaller screen sizes, especially on some smartphones like the BlackBerry, mean that applications can't be laid out as they are for web applications.

Testing the user experience (often called the customer experience for m-commerce applications) is key to ensuring that a mobile application is a success. And as different mobile devices run different operating platforms, this means testing on a range of devices.

Monday, April 25, 2011

Test performance

Amongst all the positive noise about virtualization, I noticed what appeared to be a negative article here. It advised that you approach virtualization with caution. The underlying concern was that performance was not considered appropriately. And what did it recommend to address this? Why, performance testing of course.
"Performance Testing and resource consumption needs to be conducted on a few servers or workstations with a proper backup for redundancy and failure. Setup tools need to be built in order to monitor and correct performance issues."
The article goes on to discuss fragmentation issues and bottlenecks in the virtual environment. The latter is, of course, another reason for performance testing. So on closer inspection it is not so much an anti-virtualization article as a pro-performance-testing one. Positive after all.
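The quote is light on specifics, but a minimal sketch of that kind of resource-consumption monitor, assuming the third-party psutil library and purely illustrative thresholds, might look like this:

```python
# A minimal sketch, assuming the third-party psutil library. It samples
# CPU and memory use during a performance test run and flags pressure
# that could indicate a bottleneck. Thresholds are illustrative only.
import psutil

CPU_THRESHOLD = 85.0  # percent, illustrative
MEM_THRESHOLD = 90.0  # percent, illustrative

def monitor(samples=12, interval_seconds=5):
    for _ in range(samples):
        cpu = psutil.cpu_percent(interval=interval_seconds)  # blocks for the interval
        mem = psutil.virtual_memory().percent
        print(f"cpu={cpu:.1f}% mem={mem:.1f}%")
        if cpu > CPU_THRESHOLD or mem > MEM_THRESHOLD:
            print("WARNING: resource pressure - possible bottleneck")

if __name__ == "__main__":
    monitor()
```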

Saturday, April 23, 2011

Scalability testing

Reading over some of the old entries on this blog, I found an interesting article from way back in 2005 on scalability testing. It looks at seven steps to running effective scalability tests. What surprised me was not that the steps are still valid all this time later; what is odd is that it was already talking about SOA in performance testing, which shows how long SOA has been around. And the issue of performance depending on external services remains as problematic today as it was in 2005.

Monday, April 18, 2011

Spike test

Spike testing is an interesting form of performance testing. According to Wikipedia:
"Spike testing, as the name suggests is done by spiking the number of users and understanding the behaviour of the application; whether performance will suffer, the application will fail, or it will be able to handle dramatic changes in load."
I know you shouldn't use the term you are defining as part of the definition (in this case, spiking in the definition of spike testing), but if Wikipedia is doing it, who would dare object! The reason that spike testing is interesting is not to do with this definition, though. It is now commonplace for a huge number of people to visit a website when it receives media coverage or is part of a viral exercise on the web. And these huge volumes can take out a website, because nobody expected such volumes and therefore didn't plan or test what would happen if they came.

Given the amount invested in web applications, it is surprising that website owners don't conduct what can be a relatively straightforward type of load test. Maybe this is an area where a little more education, weighing the cost of conducting high-volume spike tests against the risks of not doing them, would be really beneficial.
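For anyone wondering what "relatively straightforward" means in practice, here is a minimal sketch using only the Python standard library; the URL is a placeholder, and a real spike test would use a dedicated load testing tool rather than a single client machine:

```python
# A minimal spike-test sketch using only the standard library: hold a
# modest baseline load, suddenly spike the number of concurrent users,
# then drop back and see whether performance recovers. The URL is a
# placeholder; real spike tests use dedicated load testing tools.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://www.example.com/"  # placeholder target

def one_user():
    """One simulated visit; returns (elapsed seconds, error or None)."""
    start = time.time()
    try:
        urllib.request.urlopen(URL, timeout=10).read()
        return time.time() - start, None
    except Exception as exc:
        return time.time() - start, exc

def run_phase(name, users):
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(lambda _: one_user(), range(users)))
    errors = sum(1 for _, exc in results if exc is not None)
    average = sum(elapsed for elapsed, _ in results) / len(results)
    print(f"{name}: {users} users, avg {average:.2f}s, {errors} errors")

run_phase("baseline", 10)   # normal load
run_phase("spike", 200)     # sudden jump in visitors
run_phase("recovery", 10)   # does response time return to normal?
```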


Saturday, November 28, 2009

Automated testing and sins

It's not often you see an article on software testing where the format makes it stand out, but that is the case with this article on automated software testing. An example of how it treats the sin of Pride is below.

Picture the scene: The wool has been pulled over the eyes of Brimstone Business Application Co’s CIO. He did a deal with a big IT supplier to have their quality-centred products added to his order. At the time it seemed like a good idea – he saved the company money and had a nice round of golf rolled-in to boot. He’d fallen into the superiority trap of believing that the most expensive or most prevalent solution would always be the best, but now was beginning to realise that this particular technology was not actually compatible with his company’s needs. He’d brought this solution in, so his pride was unable to take failure. Instead he persevered until it was too late and placed unrealistic goals on his QA team, who were then forced to revert back to manual testing. As a result the project time-lines slipped, applications went out the door late and bug-ridden, which proved expensive in re-work costs and built up a huge stack of technical debt. His department was now damned to an eternity of fire-fighting the latest problems.

The punishment in Hell will be: to be broken on the wheel.

Avoidance strategy: bigger isn’t always better. Look around when evaluating new solutions.

Sunday, November 15, 2009

VSTS testing - Visual Studio Team System Test Edition

If you are using Visual Studio Team System Test Edition for testing, there is plenty of advice on the Microsoft site to help new testers and advanced practitioners. For a starting point, try Getting Started with Team System Testing Tools.

This section takes you on a tour through the tools and windows of Microsoft Visual Studio 2005 Team Edition for Software Testers. You will see basic aspects of the Team System testing tools, such as how to create and work with tests, the types of tests that are available, and how to configure the testing tools.

For a range of different testing information on VSTS, try the topics for each test type:
  • Working with Unit Tests: provides links to topics that describe unit tests and how to create them.
  • Working with Web Tests: describes how to create, edit, run, and view Web tests.
  • Working with Load Tests: describes the uses of load tests, how to edit and run them, how to collect and store load test performance data, and how to analyze load test runs.
  • Working with Manual Tests: describes how to create and run manual tests, the only non-automated test type.
  • Working with Generic Tests: describes how to create and run generic tests. Generic tests wrap external programs and tests that were not originally developed for use in the Team System testing tools.
  • Working with Ordered Tests: describes how to create ordered tests, which contain other tests that are meant to be run in a specified order.

Sunday, September 27, 2009

Software Planner newsletter

If you're looking for an overview of test cases for testing the user interface, you could start with the Software Planner newsletters from Steve Miller. For the last two newsletters he has focused on this area. In August it was "20 Useful test cases for testing User Interfaces" and this month it is "15 Useful test cases for ensuring a consistent User Interface".

Examples of the test cases include:

Screen font type: Ensure that the screen font family matches from screen to screen. Mismatching fonts within the same sentence and overuse of different fonts can detract from the professionalism of your software user interface.

Screen font sizes: Ensure that the screen font sizes match from screen to screen. A good user interface will have an accompanying style guide that explicitly defines the font type and size for headers, body text, footers, etc.

Error logging: If fatal errors occur as users use your application, ensure that your application writes those errors to a log file, event viewer, or a database table for later review. Log the routine the error was in, the person logged on, and the date/time of the error.

Error messages: Ensure that error messages are informative, grammatically correct, and not condescending.
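Checks like the first two lend themselves to automation. A minimal sketch, assuming Selenium WebDriver and placeholder page URLs (the newsletter describes the test cases, not this code), could compare the computed body font from screen to screen:

```python
# A minimal sketch, assuming Selenium WebDriver and placeholder URLs:
# compare the computed font family and size of each page's body so that
# mismatches between screens fail the test.
from selenium import webdriver

PAGES = [  # placeholder screens to compare
    "https://www.example.com/login",
    "https://www.example.com/home",
]

driver = webdriver.Firefox()
try:
    fonts = {}
    for url in PAGES:
        driver.get(url)
        fonts[url] = driver.execute_script(
            "var s = window.getComputedStyle(document.body);"
            "return s.fontFamily + ' / ' + s.fontSize;")
    assert len(set(fonts.values())) == 1, f"Inconsistent fonts: {fonts}"
finally:
    driver.quit()
```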

Tuesday, September 22, 2009

Outsourced testing webinar

There is a free webinar on Thursday 24 September at 14.00 BST. It's entitled "Outsourced Testing – You Can Have Your Cake and Eat It".

"There are now many organisations offering managed testing and QA services, with the promise of in-depth technical and testing skills, cost savings, flexibility, accelerated delivery, and better quality.
While these contracts are entered into with the best intentions, all too often these seemingly great deals soon begin to feel expensive, inflexible, and no longer targeted to the needs of the organisation. Control of how well solutions are being tested, and the continuity of value delivery, must essentially be maintained.
Anyone considering or using outsourced testing services will want to avoid the pitfalls experienced by others via proven methods of management and working. This webinar explores some practical management techniques for setting up, or rejuvenating testing managed service arrangements, including supplier management and measurement, and tools that make outsourcing testing work a success for the customer."

Sunday, September 20, 2009

Software Planner acquired by AutomatedQA

This week saw the merger of AutomatedQA with Pragmatic Software. Both companies had good testing tools in complementary spaces. TestComplete serviced the automated functional testing space while Software Planner provided testing tool support for test management throughout the SDLC (Software Development Life Cycle). The two companies had already begun to work together to provide a more integrated solution and compete with the wider testing tool offerings from companies such as HP (the Mercury testing tools) and Micro Focus (both the Compuware and the Borland (nee Segue) testing tool set).

"For too long, Application Lifecycle Management and automated testing has been out of reach for all but the biggest software development organizations", says Derek Langone, President of AutomatedQA. "The addition of the Software Planner product suite lowers the barriers to entry of cost and complexity, so every software developer, regardless of size, can leverage the immediate advantages of ALM".

Steve Miller, President/CEO of Pragmatic Software joins AutomatedQA as Vice President of ALM Solutions. "We are excited to have Software Planner join the AutomatedQA family of outstanding quality assurance products," says Miller. "Integrating manual testing and test automation into a comprehensive software lifecycle solution will allow our clients to succeed with their software development efforts while empowering teams to become more lean and productive."

Sunday, August 30, 2009

Testing DoS (Denial of Service)

If you're looking for a testing service which simulates a denial of service attack then you could try the one identified here:

Parabon, a provider of extreme-scale grid computing, is now offering public sector organisations and private sector companies a risk-free, no-obligation "red team" test attack to help them assess the reliability of critical network assets in the face of distributed denial of service (DDoS) attacks.

The offer features a grid-based load and performance testing application that can replicate a full-scale DDoS attack. This offer is Parabon’s response to the rising incidence of DDoS attacks against U.S. websites.

It's not clear if this service extends to non US websites (such as UK web-enabled applications and websites) but it could be worth contacting them to find out.

Sunday, August 23, 2009

ALM or SDLC

Is ALM replacing SDLC in marketing terms? Someone asked me that the other day and I must admit I don't know the answer. I have seen quite a few press releases recently in which testing tools are placed in the ALM context rather than SDLC.

For example, on August 17th Sys-con had an article about Seapine Software testing tools (here) under the headline Seapine Software Announces TestTrack 2010. The introduction was:

"Seapine Software, a leading provider of global quality-centric application lifecycle management (ALM) solutions, today announced the release of TestTrack 2010. This software suite delivers new features, functionality, and enhancements to the popular TestTrack Pro and TestTrack TCM products, and introduces TestTrack RM, Seapine’s new requirements management solution that manages the complete requirement lifecycle, including planning, workflow, traceability, review, change management, and reporting. All three tools seamlessly integrate with one another to provide end-to-end traceability of requirements, issues, and tests."

ALM got a couple more mentions later in the article, but SDLC was nowhere to be seen.

And on August 19th EWeek carried an article on Thoughtworks tools under the headline ThoughtWorks Brings Agile to Application Lifecycle Management with Adaptive ALM

"ThoughtWorks Studios, a provider of Agile application lifecycle management solutions and software development tools, announces Adaptive ALM, a new Agile solution for enterprise developers building ALM systems.

ThoughtWorks Studios, a provider of Agile application lifecycle management solutions and software development tools, on Aug. 17 announced Adaptive ALM, a new Agile solution for enterprise developers building ALM systems."

A stack of mentions in a few sentences, and there were plenty more in the rest of the article. So on first assessment it looks like this is the case. But two cases are not conclusive evidence, so more examples are needed before a conclusion can be drawn.


Wednesday, July 29, 2009

Testing as a Service for contact centers

I have just been reading about a new service from Empirix called "Testing as a Service". As the name suggests, it's a testing service; more specifically, it's a Quality Assurance (QA) solution for contact centers (I think these are what the good old call centre used to be) that are installing new platforms or upgrading/refreshing existing technology.

Empirix Testing as a Service is scalable: it allows contact centers of all sizes (even very large ones) to determine the quality of the customer experience as well as overall performance, ensuring the ROI of the contact center. The testing service is highly adaptable and covers the complete contact center infrastructure.

Testing as a Service uses Empirix's Hammer Test Engine for both in-service and out-of-service testing, as well as in-service monitoring. There is also a secure, real-time reporting function to monitor the testing activities of Empirix's Testing as a Service.

For more information you can visit Empirix.

Tuesday, July 28, 2009

Testing center of excellence

The Center of Excellence (which, as we have noted in the past, has now replaced Centre of Excellence) is a concept which has been around for some time. It has gone in and out of favour as organisations have changed and adapted to different approaches for building and delivering IT.

HP Software have been using the term to cover their QA and testing offerings. HP Performance Center is performance testing software aimed at supporting a performance testing center of excellence (CoE). Quality Center is a test management tool for managing a QA Centre of Excellence. You can get an HP white paper on Building and Managing a Quality CoE.

There is also the HP Application Security Center. And Paladion have announced that they have built an Application Security Testing Centre of Excellence in India around this HP tool. It combines an IT infrastructure with experienced security testers and best practices in using the HP Application Security Center. Its purpose is to locate and fix security vulnerabilities in software applications throughout the full Software Development Life Cycle (SDLC).

Monday, July 27, 2009

Acceptance Test Engineering Guide

Microsoft have released a draft version of a guide on acceptance testing. As well as covering the testing aspects of the acceptance process, it covers the collection of data in order to make a decision during the acceptance activities. It also covers preparation for acceptance testing and the approaches to acceptance testing: specifically, whether you do it in a separate stage (an acceptance testing phase) or throughout the full Software Development Lifecycle (SDLC). The latter they call Incremental Acceptance Testing.

You can download a copy of the Acceptance Test Engineering Guide and provide feedback on the same website in the discussion forum.

Sunday, July 26, 2009

Software testing books

Looking at software testing books, I came across some interesting lists at SoftwareQATest.com. As well as a software testing books list, there were lists of automated testing books, QA books, and a host of other IT topics like project management and risk management. But back to the software testing list. The top five books in the list, which are the ones most recommended by the site, each had a mini review. For those interested, the choices are:

  1. Lessons Learned in Software Testing, by C. Kaner, J. Bach, and B. Pettichord (2001)
  2. Testing Computer Software, by C. Kaner, J. Falk, and H. Nguyen (1999)
  3. Perfect Software and Other Illusions About Testing, by G. Weinberg (2008)
  4. How to Break Web Software, by M. Andrews and J. Whittaker (2006)
  5. Testing Applications on the Web, by H. Nguyen, R. Johnson, and M. Hackett (2003)

If you want to see the reviews, you can visit the site. Or, better still, get the books and make up your own mind.