Here's another chat I have had many times. It starts with the golden and notorious question of "how much should you test?" and then drifts a bit from there.
********
How much should you test?
"Well, you know, I used to work once in this team doing a medical device where releasing frequently was not really possible due to the many constraints. And even doing a small release would cost a lot. And it being a class C (death or serious injury may occur) device all issues found on a released product would have been a pretty big deal. So this meant that we took unbugginess very seriously, and thus that I had a lot of time to spend testing the device.
And boy did I.
Sometimes I spent almost the entire week just in front of the machine testing: thinking of ways it could fail, unrolling every test technique I had in me, learning the ins and outs of the product. And then one day, when the new release was done... several problems were found. So taking into account that we really bled our hearts out for this and problems were still found, I think you shouldn't test too much, as you won't catch all the problems anyway.
And then again, sometimes (especially these days) we release stuff having spent close to no time testing, and no problems are found. But I often feel that without the testing you miss out on the chance to learn, to get ideas, to make suggestions, and to give feedback to the developers and the business. So you shouldn't test too little either, because it ain't only about catching bugs.
And if you consider testing "done" once you have deployed to production, think again! In production you have a beautiful chance to investigate if and how the new stuff is being used and whether it is working as expected, and to be ready to react if/when something unexpected happens. Testing in production is often at least as important as (if not more important than) the testing you do before the release. And if you can't see or affect what happens in production, then something is missing from the implementation: proper logging, pilots, toggles, communication with customers, etc."
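(To make the "logging, pilots, toggles" part concrete, here is a minimal sketch of a feature toggle with logging around it. The whole example is hypothetical: the `checkout` functions, the `FLAGS` dict, and the flag name are made up to illustrate the idea of being able to see and affect what a new feature does in production, not any particular library.)

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("checkout")

# Hypothetical in-memory toggle store. In a real system this would be a
# config service or a feature-flag library, flipped per pilot group.
FLAGS = {"new_checkout_flow": False}

def is_enabled(flag: str) -> bool:
    # Sketch: a plain dict lookup stands in for the real flag service.
    return FLAGS.get(flag, False)

def old_checkout(order_id: str) -> str:
    return f"order {order_id} processed (old flow)"

def new_checkout(order_id: str) -> str:
    return f"order {order_id} processed (new flow)"

def checkout(order_id: str) -> str:
    # Logging which path ran is what makes "testing in production" possible:
    # you can see if and how the new stuff is actually being used.
    if is_enabled("new_checkout_flow"):
        logger.info("order %s: new checkout flow", order_id)
        return new_checkout(order_id)
    logger.info("order %s: old checkout flow", order_id)
    return old_checkout(order_id)

# Pilot: turn the toggle on for a trial run, watch the logs, and be
# ready to flip it back off if something unexpected happens.
FLAGS["new_checkout_flow"] = True
print(checkout("42"))
```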
So what about the significance of test automation? That must play a role too in how much you should test? And what about risk? And what ab..
"yea yea I know. Of course it depends on the context, on the level of automated tests, and on the risk of the change. It always depends. Everything always depends. But you get involved, you do what is needed at the given time, you make mistakes, you try to do better next time. And you will figure it out."
How about being involved early then? Like testing requirements?
"Well.. I don't really like to talk about "testing requirements". Sounds like reading thru excels trying to find sentences that don't have the word shall... But of course you want to be involved on all parts of the product development. Starting from discussing why should we do this thing, what impacts would we want to achieve. Then discussing and working together on what do we do, and how do we do it. Pair or mob on the code, do code reviews, talk with the users and the whole lot. I think doing stuff together from the start is in the long run so much more effective. And so much more fun!"
Aren't you afraid of losing objectivity then, when testing something so familiar?
"Not really. I am more afraid about testing something I do not understand. Or of testing irrelevant things because of not knowing the implementation. I am also a lot more afraid of solo work than not being objective. And anyway, I think you can still be objective while testing by switching your approach while exploring. E.g. by running through different scenarios. And it is really great to do some mob testing here too."
What do you call yourself these days? A tester? A QA? A developer?
"No one answers better than the prisoner: https://www.youtube.com/watch?v=d0LaT6qVRpg :)"