Sunday, June 24, 2018

Uncalled rant on testing metrics

Uncalled because we all already know this, right? Great people such as Cem Kaner told us about it a long time ago.

Uncalled as I haven't used these in years.

Written because I was just asked by management to report product fault statistics.

Using any sort of defect/bug/fault count related metric in software development is harmful.

Bug counts fall apart from the start, as they try to quantify something that is inherently qualitative. Additionally, they make people focus on reporting problems rather than solving them. And really, bug counts tell nothing about the product that the people involved wouldn't already know.

The only good thing about bug statistics in software development is that they give test managers a very easy way to provide meaningful- and professional-looking, but totally hollow, metrics.

And that is not good.

Using any sort of test case count related metric in software development is harmful.

A test case is not a specific thing. A test case can be good, bad, incomplete, false, useless, misleading, dishonest, etc. A test case may be small, large, expensive, cheap, etc. And no amount of test cases will ever be "all the test cases". Counting these together gives you a sum that says nothing about the coverage of the tests on the product under test.

A passing test may mean that the test, the tester, the system under test, or the circumstances of the test were right, or wrong. A failing test means the same. Counting these says nothing about the quality of the product under test.

The only good thing about test case counts is that they give test managers a very easy way to provide meaningful- and professional-looking measures of the progress of testing that actually have no substance.

And that is not good.

Want to get information about the quality of your product and process?

Ask the developers, testers, sales people, customers, and end users. Investigate the root causes of problems. Hold retrospectives. Analyse logs and usage of the system.

Do the work. Don't settle for defect and test case counts just because those are easy.

Sunday, June 10, 2018

I don't report bugs

I don't report bugs. "Bug" is such a loaded word, understood very differently by different people, that instead of using it and explaining what I mean by it, I'd rather just use other words. Like observations, thoughts, surprises, ideas, alternatives, or something similar. (And no, I don't use fault, defect, or error either.)

"Bug" also has quite a negative connotation. "Reporting a bug" is kind of like telling someone that they've been served. And as we are actually giving away the gift of information, why wrap it in such a nasty package?

And maybe more importantly, it is very likely that whatever you might have to say is wrong. If not plain wrong, then at least incomplete. So I like to approach these kinds of situations with the assumption that I am probably wrong. Cutting off anything that might sound arrogant makes things quite a lot easier, especially when you realise later on that you have been wrong.

I leave plenty of observations unreported. I don't want to waste my or my colleagues' time on stuff I believe will not lead to any action. If I see no risk and little or no value, I just drop it. Someone might think this is irresponsible; I consider it professional.

I don't write reports. Writing a great report takes a lot of time, and it still might very easily be understood differently. So I'd rather go talk to someone, or ask someone to come and take a look. Demoing a behaviour to someone is faster, enables better understanding, and together with other people you can usually find the possible root causes faster.

Still, sometimes for various reasons I may write something down. If I do, I:
1. Keep it short. People will lose focus, or not even read, long texts.
2. Tell what the observation is: what happened. E.g. this happened, although I would've expected that.
3. Tell why I think it is interesting. But briefly.
4. Give a bit of detail: an ID, a log snippet, a link. Just enough to understand how the observation could be seen.

But I will not write it into a backlog or defect tracking system. It's such a sad thing to see those beautiful bug reports getting buried in defect tracking systems or backlogs, forever to be forgotten. So rather into a chat, where someone might even comment on it.

If I can wrap all this into a failing automated test, I might. But then I often feel like I am spending a bit too long on my own, missing the communication. Plus, if I'd go this far, why not just go and fix the problem myself, or by pairing up with someone.

Can you imagine: I used to lecture about bug reporting, I used to arrange competitions on who has the best bug reports. And now I won't even write them anymore.

And I love it.

Wednesday, May 30, 2018

10 things to help you suck less in prioritisation

Improvements in how things are being done don't help that much if you are doing the wrong things.

Focusing on cutting down the deployment/production pipeline, using the latest and greatest languages and tools, exploratory testing, mob programming, etc. will surely boost efficiency. But efficiency is not key if you are doing the wrong things.

And quite often we are.

And a big reason for that is, that we suck at prioritisation. We suck at it because we:
- spend too little time on it: "But we could save minutes of talking by hours of coding!"
- do it too rarely: "Welcome to our annual roadmap revision meeting."
- try to have specific people/roles be responsible for it: "Ask the PO..."
- do not think about different dimensions enough: "But the customer needs it!"

But mainly we suck at it because it is so hard.

Here, though, is a list of 10 things I think might help.

1. Don't keep a big backlog. Focus on the things being done now, and on the few things to do next. Forget the rest.
2. Do not rank things with labels; instead just rank them in order. We have all seen too many priority-1 projects...
3. It's ok and good to have visions and high level plans for longer times. But don't put them on a "roadmap" and then forcefully execute that.
4. Avoid long, seldom-occurring prioritisation meetings. Instead prioritise often and ad hoc.
5. The value is in the idea, not in who presents the idea. Let the ideas compete, not the people.
6. Do not only consider the cost to build, but also the cost of delay (how much do we lose or not gain while this is not done), the opportunity cost (what else could we be doing instead of this), the cost to maintain, etc.
7. Do not only consider the value to customer but also team motivation & wellbeing, code and system infrastructure simplicity, brand, support, relationships between stakeholders, etc
8. Involve everyone in prioritisation. It's hard. It's messy. It's important.
9. Try to get everyone to understand why we decide what we decide. Not everyone needs to agree, but understanding is very important.
10. Look back at the good & bad decisions. What helped you select the right thing back then? Why the hell did we end up doing that?
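The cost-of-delay idea in item 6 can be made concrete with numbers. A minimal sketch, with invented figures, of ranking work by cost of delay divided by duration (often called CD3) rather than by build cost alone:

```python
# Illustrative numbers only: two hypothetical features, each with an
# estimated cost of delay per week and an estimated build time in weeks.
features = {
    "checkout-fix": (8000, 2),   # loses 8000/week while undone, 2 weeks to build
    "new-dashboard": (5000, 6),  # loses 5000/week while undone, 6 weeks to build
}

def cd3(cost_of_delay, duration):
    """Cost of Delay Divided by Duration - higher means do it sooner."""
    return cost_of_delay / duration

ranked = sorted(features, key=lambda name: cd3(*features[name]), reverse=True)
print(ranked)  # checkout-fix first: 4000/week vs roughly 833/week
```

The cheaper-looking dashboard loses to the small fix once the delay cost is on the table, which is exactly the dimension that "but the customer needs it!" discussions tend to skip.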

Why ten? It sounds nice, fits into a board made of stone, and because this post was very late already due to some bad prioritisation.

I'll try to do better next week..

Sunday, May 13, 2018

Six reasons why testers should do code reviews

I have had quite a lot of discussions about code reviews. Quite many also with testers, from which I have understood that many of them do not do code reviews.

I will not start arguing here about whether code reviews are good/important or not. But I will list a few reasons why I think testers would benefit from doing them.

1. Code is the only documentation that is up to date. If you really want to know how something really functions, you want to be able to see and read the code.

2. Knowing more about the thing being done enables you to do better testing. You can spot things that you should definitely test, and things that you probably don't need to test that much. Like extra things added by the coder, usage & modifications of existing functions, data types, etc.

As for the argument that one loses their "independence" as a tester by knowing too much: I would worry a lot less about that than about testing stuff you have no idea how it has been built.

3. Improve logging. I obsess over logging. I am always asking to add more logs, with more details, and at the correct levels. Logs can help a ton when testing. Logs will be indispensable when tracking down those problems in production. Logs can help you analyse the usage, and non-usage, of your system to spot further improvements.

If you are a tester and you are not obsessed with logs: why?

4. Add testability. Logs are kind of a part of this already. But knowing how stuff has been built can help you ask for more control over some parts, or more visibility into others.

5. Read the unit tests. In my experience this is the most overlooked place when developers do code reviews. And tests are something that testers should be pretty good at... Some examples: suggesting new tests, deleting unnecessary tests, better input variables, and stronger assertions.
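As a hedged example of the "stronger assertions" kind of feedback (the `parse_price` function is invented for illustration): a weak assertion only checks that something truthy comes back, while a stronger one pins down the exact values.

```python
# Hypothetical code under review; parse_price is made up for this example.
def parse_price(text: str) -> float:
    return float(text.replace("€", "").strip())

# Weak: passes as long as the call returns anything truthy.
def test_parse_price_weak():
    assert parse_price("€4.50")

# Stronger: pins down the exact value and also covers whitespace input.
def test_parse_price_strong():
    assert parse_price("€4.50") == 4.5
    assert parse_price(" €0.99 ") == 0.99
```

Pointing at the weak version and asking "what would this still pass for?" is often enough to get the assertion tightened.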

6. You might spot some other things too! Asking about unclear parts may reveal unnecessarily complex code, or you might learn something. Variable names can often be better. And occasionally one might even spot a functional issue too. But I would suggest being quite careful when offering the feedback. Asking why something was done in a certain way is usually better than telling how you think it should have been done.

Not doing them and still not convinced? It might not be for everybody, but based on my own experience I do suggest at least trying it out a few times.

No access to the source control system? Ask and thou shall be given.

Can't read the code? Read it anyway. You will learn, and get better at it. And reading good code is a lot easier than writing it. Kind of like reading good books is easier than writing them.

Other team members think it is not useful for testers to do code reviews? Hard to think that this would be an actual issue... But if it were, ask them to read this post and leave a comment telling why. If they do and I cannot counter-comment, shame on me. If they won't, have a good time doing those code reviews!

Sunday, April 29, 2018

10x tools #2: Clipboard history

Back with the 10x tools journey!

Last time I talked about the Kipling method, and this time it is the turn of the tool that made me want to do the 10x talk in the first place. The tool is so simple, yet so useful, that I really would not want to work with a computer that doesn't have it.

The tool is, tat ta da daa, clipboard history. 

You know how the plain Windows & Mac operating systems' clipboard works. Copy something to the clipboard, and it is there. Copy something else to the clipboard, and the previous item is lost. And that really sucks. As a result, you might end up going back and forth between two documents copy-pasting, or keeping a separate place to paste intermediate stuff. Or sometimes you might accidentally copy something new and lose the previous item from the clipboard.

So clipboard history saves you in those cases. But after using it for a while, I've started to use it for other things as well. For example, when I see something I think I might need later on, I'll just copy it. If I am editing something that is not autosaved, like a web form, I'll take backups every now and then with the good ol' ctrl/cmd+c. And if I am deleting something, I don't. I'll cut instead, and have it in my clipboard just in case. And then when I need to find "that one uuid" I used yesterday, I'll just go and find it in my clipboard history.

I've been doing more and more programming too, and copy paste is very common there as well. At least when I do it :)

So you really want one of these tools.

These days I use a Mac, so my tool is currently Flycut - which is not that good, as it has quite a short memory (91 items) and it only stores text. Alfred costs a bit but will give you a longer history and also images in the history, plus a lot of other useful things, I've heard. So I should upgrade to that.

The only thing I miss from the Windows world (on top of the forced updates and daily reboots :D ) is Ditto, which will even give you a history of files, a great search, and great usability. Man, I reeeally liked that tool.

So that's it, number 2 of the 10x tester tools. And it's a life changer. 

Don't believe me? Just check these user reviews of Flycut from the Mac App Store (try to read them in a shopping TV channel voice):
"The only think I can say about this app is as a programmer and a power user, this app has changed my life. "

"Indispensible. I must use this a dozens of times a day - can’t imagine doing without."

So why wait, get it today for free!