
Testing Web Services

There was a recent thread on the agile-testing mailing list about testing Web Services. I ended up having an interesting private conversation with the original poster, during which I said:

I would immediately ask you "how are these services implemented?" Assume that your web services are implemented as pure XML-over-HTTP. I'm building a system-test framework in Ruby right now that understands how to talk to and how to listen to these XML-over-HTTP interfaces. I could just as well be using SOAP or REST, but in this case I don't have to.
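To make that concrete, here is a minimal sketch of what "talking to and listening to" an XML-over-HTTP interface looks like in Ruby. The endpoint, payload schema, and field names are invented for illustration; a real service would define its own.

```ruby
require "net/http"
require "uri"
require "rexml/document"

# Hypothetical endpoint -- substitute your own service's URI.
SERVICE_URI = URI("http://localhost:8080/customer/lookup")

# Build the XML request body the service expects (illustrative schema).
def build_request(customer_id)
  "<lookup><customer-id>#{customer_id}</customer-id></lookup>"
end

# "Talking to" the interface: POST the XML, return the raw response body.
def call_service(body)
  Net::HTTP.start(SERVICE_URI.host, SERVICE_URI.port) do |http|
    http.post(SERVICE_URI.path, body, "Content-Type" => "text/xml").body
  end
end

# "Listening to" the interface: pull a field out of the XML response
# so a test can assert on it.
def extract(response_body, xpath)
  doc = REXML::Document.new(response_body)
  REXML::XPath.first(doc, xpath)&.text
end

# The listening half needs no live service -- a canned response will do:
canned = "<customer><name>Acme Corp</name><status>active</status></customer>"
puts extract(canned, "//status")  # => active
```

The point of separating `call_service` from `extract` is exactly the black-box stance described below: the test framework cares only about the information crossing the wire, not how the service computes it.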

> Any "Gotchas" I need to be aware of? Special strategies / approaches /

I think it boils down to two questions: the choice of tools, and the choice of approach. Almost any set of tools can work with either approach. Do you know Brian Marick's distinction between "business-facing" tests and "technology-facing" tests?

The choice of approach I identified is to test with unit-like tests that validate whether the service is performing properly internally, or to take a more black-box approach that validates whether the information that the service processes is correct on the way in and on the way out.

The tools choice is whether to use unit-test-style tools tightly coupled to the code itself, like JUnit or NUnit, or to use a more foreign framework that uncouples the tests from the implementation of the services.

I've chosen to implement a foreign test framework in Ruby that concentrates on correctness of information moving in and out of the service. I did this for two reasons: I'm working with great developers who can be relied upon to have implemented (and tested!) the service correctly; and I'm testing against an idiosyncratic 30-year-old legacy database, so unusual data conditions are more likely to cause failures than are incorrect services. It's a matter of adjusting to risk.

The bit above assumes that we all agree on the definition of a "Web Service", which is in fact probably unlikely. I'm stipulating that a well-known interface implemented as an XML payload in an HTTP POST operation is in fact a web service, even in the absence of WSDLs, SOAP, REST, or any of those other goodies. Even if it's false, it makes a good argument.

I also assume that we all agree on the purpose of our web services, which is slightly more likely-- is the Web Service on the Internet, or is it for applications behind a firewall?

I'm thinking of these services in terms of an Enterprise Application Integration system that could implement any number of interfaces in addition to my (broadly defined) Web Services interfaces. This is another reason why a robust general-purpose scripting language like Ruby is an attractive choice for building the tests: besides HTTP, my tests will also have to speak ODBC and FTP and talk to the filesystem, among other EAI interfaces.

There'll probably be more on this subject later, but for those interested in designing system tests for such services and architectures, these are required reading.

