Automated testing is a craft. It requires skill, training, and at least a bit of talent. I guess it’s like everything else in that sense. But it’s unlike everything else in that you are doing automated testing. Not a lot of people do that.
There are way more people who write code or do testing than who do both, even including the two fringes of writing unit “tests” and record-and-playback “testing”. I think (somewhat vainly) that the ability (and inclination) to automate testing on a practical, functional level (in a useful way) is an unusual commodity.
Or maybe I’m just not that talented and have chosen the career path equivalent of joining the water-polo team. (Notice that I restrained myself from saying “lacrosse” or “soccer”.)
As a result, I think we tend to get into our silos, and each have to reinvent not only the wheel, but the concept of propulsion. We scrape together bits and pieces from the internet, and occasionally find published gems like Beautiful Testing or Java Power Tools. If we’re lucky, we make a few contacts with peers through blogs, local user groups, or conferences and share ideas. If we’re really lucky we get to work with and learn from a few insightful co-workers. But let’s face it, test automation isn’t exactly good water-cooler fare.
Beautiful Testing is a great book (that I hope to read someday soon). I’d like to see something focused, concrete, and relevant. I don’t want tutorials or best practices, but individual experiences, so others can say “I do that too” or “I didn’t think of that.”
I want it to be focused enough that there’s common ground to discuss details. At the same time, I’d like to see discussions of the pros and cons of specific tools (Selenium vs. Watir) and techniques (browser-driven vs. browser-simulated), and to plunge into the details of design patterns (Page Objects, Helpers) and organization strategies (suite grouping, continuous integration, test environments).
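To make those “details” concrete, here’s a rough sketch of what a Page Object might look like with Selenium’s Python bindings. The page URL, element locators, and class names are all invented for the sake of illustration; the point is just that locator details live in the page objects, so the test reads as a sequence of user actions.

```python
# A minimal Page Object sketch using Selenium's Python bindings.
# The URL, locators, and class names below are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By


class LoginPage:
    """Wraps the login page so tests talk to intent, not locators."""

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get("https://example.com/login")  # hypothetical URL
        return self

    def log_in(self, username, password):
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "submit").click()
        return DashboardPage(self.driver)


class DashboardPage:
    """Wraps the page the user lands on after logging in."""

    def __init__(self, driver):
        self.driver = driver

    def greeting(self):
        return self.driver.find_element(By.CSS_SELECTOR, ".greeting").text


# A test then reads like a user story rather than a pile of locators:
if __name__ == "__main__":
    driver = webdriver.Firefox()
    try:
        dashboard = LoginPage(driver).open().log_in("alice", "secret")
        assert "alice" in dashboard.greeting()
    finally:
        driver.quit()
```

Whether that’s the right way to slice it (helpers vs. page objects, how much flow logic belongs in the pages) is exactly the kind of thing I’d want the case studies to argue about.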
I guess what I’d really like to see are case studies, real or hypothetical, that tell why certain decisions were made, how they were implemented, and what lessons were learned. I could see either focusing specifically on web application functional testing with Selenium, or covering a variety of platforms (iPhone, web services, AJAX) and tools (Selenium, Webrat, Cucumber, SOAPUI, etc.). I suppose a certain variety would be needed to discuss the pros and cons of each decision.
I’m not looking for rights and wrongs or “best practices”, just individual solutions (I guess I’m repeating myself). Anyway, I’d envision something like 10 essays, with a discussion afterward among the authors, and then of course an online forum where all comers could weigh in.
I’d be interested in hearing from QA “celebrities”, authors, and trainers, including (but not limited to):
Tool creators (a few of many):
as well as anonymous testers in their silos who maybe haven’t written a word about what they’re doing.
Aaron,
I am the only tester at my company, and I am working with a team of 12 developers. I can honestly say that I am the only person on the team who takes any joy from writing test automation, and, as a result, I find myself in a skills/knowledge silo, as you mention.
I would be interested in taking part in some kind of test tools case study or comparison. I would certainly like to come up with some common testing problems, and discuss how best they could be handled with the different products available.
BTW – in the rest of the world outside the U.S.A., joining the “soccer” team would be the path to popularity/glory.
I know about soccer; that was just a personal dig at some friends here who are into it. And it is getting more popular these days in the USA.