Apropos of a recent discussion on the software-testing mailing list, I was reminded of reading James Bach's Agile Test Automation presentation for the first time. It contains this one great page that says that test automation is:
any use of tools to support testing.
Until then I had held the idea that test automation was closely tied to manual tests. I knew about data-driven and keyword-driven test frameworks, and about tests that relied on coded parts to run, but I still assumed a necessary connection between manual tests and automated tests: that proper automated testing was simply the machine taking over and emulating human actions that were expensive or boring.
That way, of course, lies madness. And expense.
It was reading this particular presentation that really lit the light bulb and got me thinking about all the other things that testers do, and all kinds of other ways to approach test automation. Here are some things I've done as a test automator since I read Bach's presentation that bear no resemblance to a test case or a test plan (in no particular order):
2nd party test server was mostly down, so I wrote a sockets-based server to impersonate it.
Script to collect and organize output for a gigantic number of reconciliation reports.
Script to parse compiler logs for errors and warnings.
Script to deploy machine images over the network for test environments.
Linux-based file storage-retrieval-display system in an all-Windows shop.
Script to parse entire code base and report all strings shown to users. (So humans could find typos.)
Script to reach into requirements-management-tool database and present requirements in sane form.
Various network traffic-sniffing scripts to get a close look at test data in the wild.
Script to compare file structure on different directories.
Script to compare table structure in different databases.
Script to emulate large numbers of clicks with interesting HTTP headers on the GET.
Scripts to install the nightly build.
Monkey-test scripts to click randomly on windows.
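The first item on that list is a good example of how small these tools can be. Here is a minimal sketch of that kind of impersonator server, assuming (hypothetically; the post doesn't say) that the real second-party server spoke a simple line-oriented protocol over TCP. The canned replies and handler names are mine, for illustration only:

```python
# Sketch of a fake server that stands in for a mostly-down test server.
# The protocol (one request line -> one reply line) is an assumption.
import socket
import socketserver
import threading

CANNED_REPLIES = {
    b"PING": b"PONG",
    b"STATUS": b"OK",
}

class FakeHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Read one request line and answer with a canned reply,
        # so the tests can keep running while the real server is down.
        request = self.rfile.readline().strip()
        self.wfile.write(CANNED_REPLIES.get(request, b"ERROR") + b"\n")

def start_fake_server():
    # Port 0 lets the OS pick a free port; serve on a daemon thread.
    server = socketserver.TCPServer(("127.0.0.1", 0), FakeHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_fake_server()
    host, port = server.server_address
    with socket.create_connection((host, port)) as sock:
        sock.sendall(b"PING\n")
        print(sock.makefile().readline().strip())  # PONG
    server.shutdown()
```

The point isn't the protocol details; it's that a few dozen lines of socket code can unblock an entire test effort.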
And here is more testing-like work I did as a test automator that was still mostly not about validation:
Emulated GUI layer to talk to and test underlying server code.
Gold-master big-bang test automation framework.
SOAP tests framework.
Watir scripts for really boring regression tests. (But I didn't emulate the user, instead I made the code easy to maintain and the output easy to read.)
And lots of other odd bits of reporting, manipulation, chores and interesting ideas.
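Most of these chores reduce to the same shape: scan some text, pull out the lines that matter, report them in a sane form. The compiler-log parser mentioned above might look something like this; the gcc-style "file:line: error: ..." format is my assumption, since real logs would need their own patterns:

```python
# Sketch of a compiler-log parser: pull errors and warnings out of a
# build log. The line format here is an assumed gcc-style convention.
import re

LINE_RE = re.compile(
    r"^(?P<file>[^:]+):(?P<line>\d+):\s*(?P<kind>error|warning):\s*(?P<msg>.*)$"
)

def parse_log(text):
    """Return (errors, warnings): lists of (file, line, message) tuples."""
    errors, warnings = [], []
    for line in text.splitlines():
        m = LINE_RE.match(line)
        if not m:
            continue  # skip lines that aren't diagnostics
        entry = (m["file"], int(m["line"]), m["msg"])
        (errors if m["kind"] == "error" else warnings).append(entry)
    return errors, warnings

if __name__ == "__main__":
    sample = (
        "main.c:42: warning: unused variable 'x'\n"
        "linking...\n"
        "util.c:7: error: expected ';'\n"
    )
    errs, warns = parse_log(sample)
    print(len(errs), len(warns))  # 1 1
```

Nothing here validates the product directly, but it turns a wall of build noise into something a human can act on, which is most of what these scripts were for.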
I have a few conclusions from some years of working like this. Or maybe they're just opinions.
1) Excellent testers should be able to address the filesystem, the network, and the database, as well as the UI.
2) Testing is a species of software development. Testers and devs are closer than they think.
3) Testing is a species of system administration, too.
4) Testing is a species of customer support, also.