Sunday, March 4, 2018
Testers doing test automation - is that the most important thing for you to do right now?
I've been thinking quite a lot about testers moving to do test automation, lately because of these three things:
1. European Testing Conference, a great testing conference I attended a couple of weeks ago. It is very cool for many reasons: the way the speakers get compensated, the focus on the conferring side of conferences (making it very easy for people to discuss stuff), the way the talks are chosen, and the fact that a lot of developers join too. So anyway, when I was talking with several of the attendees, it was a bit strange how it was easier for me to talk about product development as a whole with the developers, whereas with the testers the conversation more naturally moved into talk of automation and tools. Also in the open space, I think the majority of the topics pitched were automation or tool related, and very little was on the process or on the customer-facing side of product development.
2. In my company there are a lot of different products and product teams, and in an attempt to share some knowledge between the teams there are different guilds: a devops guild, an architecture guild, an API guild, and a testing guild. Usually when the testing guild meets, it is mainly testers from different teams participating, and a topic is introduced by someone with a little bit of follow-up discussion. Three of the last four topics introduced have been quite automation/tool-centric. (And the non-automation-centric one was my topic.)
3. This week I read a LinkedIn post from an old colleague titled Runnable Specifications are here, never speak about test cases anymore. In it he made several good arguments against step-by-step test case execution by a tester and offered the role of executable specification writer instead. So, kind of, shifting left and doing automation.
So I don't have anything against test automation (well, I do make a case against stupid automation, but anything stupid is stupid, so that doesn't really count), but I do have a little trouble understanding why so many testers think that's what they should be doing now. I mean, I do not try to argue that anybody should write and maintain test cases and then "execute" them manually. But is the only alternative to this for the tester to automate those test cases? I don't buy it.
First of all, I'd say that most commonly there are a lot fewer "testers" than there are "programmers" on a product development team. And I'd say that commonly the people identified as "testers" are not as good at programming as the ones who identify as "programmers". So I think the heavy lifting of test automation (which is also programming) should be done by people who know programming. Because they can, you know, program. And if they do test automation, they might make the app more testable in the first place. They also know how the stuff works, so they can automate the tests faster. And if the same people work on the code and the test automation, it ain't as likely to create bottlenecks and missing automation. And if it is already OK for programmers to handle the unit and often also the integration-level test writing - what is so different about the end-to-end level? Testing expertise comes in handy here of course, as testers might be better at providing stronger assertions, better test data, better test scenarios, etc. So teaming up might be a really nice idea (it almost always is ;) ). But who-does-what should be based on motivation and skills - not on some label.
Secondly, when I think of the biggest problems in any of the products I have ever worked on, those would not have been prevented with more test automation. Many things might have been easier, sure. And many bugs would have been caught earlier, I believe. But the biggest problems? No. And this leads to a crazy statement.
There is more to product development than programming and testing.
I don't want to diss programming. It is an art. An art I ain't really good at (at least yet). But there's other stuff. There are alternatives to brainless test execution by a person, or to making the brainless test execution happen automatically. I'll mention a few.
Product management. This is what makes or breaks the product. A good idea done shittily might still give people value. But a shitty idea done in the most beautiful way, with beautiful automated tests, is still a shitty idea. It is like an ass made of silver - it's shiny but it's useless. So this is a place where we need more emphasis! More brains engaged in making sure we pick the right stuff to do, that we focus on impacts, that we focus on not doing too much, that we pick the right level to prototype, that we do not overcommit too early, that we do stuff that is of value. And this is not the responsibility of a "product manager". It is the responsibility of everyone on the product team! And this requires work.
Stakeholder involvement and communication. If I had to pick one thing that has caused the most problems in all the stuff I have been working on, it has been lack of communication. Within the team, yes, but especially between the team and its various stakeholders. We need people who not only consider what stakeholders need and want, but who actually do the work of discussing, asking, asking again, demoing, listening, telling, and managing expectations with the stakeholders. Sales, marketing, support, customers, end users, integration partners, and whoever else has an interest in your product. They need to be heard, and they need to know what's happening. And the team needs to do this. And this requires work.
Planning. We need people to make sure we are planning ahead - not too much, but not too little. Making sure we have an idea of the vision, that we think we know the next couple of steps, and a bit of the risks and alternatives. And that the team always has a shared understanding of what we are going to do. Creating this ain't easy. It requires work.
Process. It is easy to settle; continuous improvement is not. It is easy to agree to stupid procedures you are given; it is not easy to explain why you need to do things differently. It is easy to believe that everyone is happy; it is hard to know if they are. It is easy to create silos; it is hard to break them. And it is easy to fall back to routine; it is hard to really start doing something in a different way. This requires work.
Monitoring and analysis. What is your definition of done? Acceptance tests pass? Documented and released to prod? That is not done; that is the start. Actually following what is happening in production, analyzing usage, digging for problems, thinking of improvements, and making sure those improvements get done is a super important part of product development. And it requires work.
Exploratory testing. Being there constantly looking for problems, looking for alternative ideas and things we did not think of, providing constant feedback to the team, and learning the whole product. It. Requires. Work.
And a lot of other stuff.
Again, I don't want to say that (test) automation is not important. Hell, I would like to do more programming myself too, because it is fun! And I guess it is easier to add that to employment ads and CVs. But there are also other things that a product development team needs to do.
So the next time you start writing those automated tests, I want you to think: is this the most important thing to do right now?