Saturday, 28 April 2018

Testing Tunes 2. Embrace the jazz hands.

Okay, this is the second post in this odd series. I have a more testing-specific post knocking about in my head, but I'm trying to figure out how to trim the ALL CAPS ANGER out of it. Whilst I'm doing that, this arrived partly formed, so I thought I'd better finish it.

So, here's Testing Tunes #2. I wouldn't be surprised if you don't know who Felix Hagan and the Family are, but I promise you they're well worth a listen. I suppose the way I would describe them is 'storytelling glam rock with a heavy dose of musical theatre influence'. I mean that in the most positive way, I promise. It's melodrama with lyrical flair and I love it.

I was on the last few minutes of my slow, cumbersome run from the train station into the office the other day, playlist blaring through my headphones on random just to get me to even move my feet. I was ready for a shower but wasn't sure how ready I was to start the working day when the following song soundtracked my last few steps and entrance into the building.

(apologies for a Spotify link, I couldn't locate it on YouTube)

It's a song I've listened to dozens of times before but for some reason on this drizzly morning I noticed a specific line in it which got me thinking in a way it hadn't before.
'There's no free ticket for the rock star with the most convincing please'
What this said to me, basically, was that timidity will not help you get what you want. Don't complain if what you want is hard to achieve. Demand to be heard. Impress people. Do it with jazz hands and get the job done. As software testers, if you're relentlessly advocating for testing to a team or company that's not fully convinced, you may need a bit of inspiration and a healthy nudge every now and again to get yourself re-invigorated. For me, this is a song which helps with that.

If you're still reading this, please listen to the lyrics of the whole song, or else the rest of my ramblings could seem off topic. The whole song, to me, is about loving what you're doing. It's about encouraging you to have enthusiasm for your craft and put your brave face on to try and make an impression and get what you want. Whilst I think that can probably be used as guidance for lots of professions, I don't do anything else: I test software, so that's what I related it to. I think this spoke to me because testers are frequently the only people holding their point of view in a conversation or team. As a lone voice it is useful to have the self-motivation to keep advocating for testing, even when it's hard or no-one is listening.

The other lesson I think this song can remind me of is that sometimes you need to steal from those you admire to be the version of yourself that gets the job done. It's not disingenuous to alter your behaviour or approach in different circumstances in order to achieve your goals. (I promise you that message is in the song as well.)

Try being proud of your actions. Try thinking about whether you would want junior members of staff to learn from your example. If you would want new testers copying you, keep doing what you're doing. Do it more, do it proud and do it with gusto.

So try embracing the jazz hands. Put your best smile on, stand tall and go wow the world.

Sunday, 8 April 2018

Testing tunes

Sometimes it is possible to draw inspiration for your testing from strange places. It is also possible to find meaning in art that wasn't there when the creator let it out into the world. Recently I had both of those moments at the same time.

One of my favourite musicians of all time is English folk punk troubadour Frank Turner. His new studio album Be More Kind is scheduled for release at the beginning of May, and as such he's started building some hype and releasing some of the songs. So, I was listening to one of the newly released songs, the title track in fact.



I wasn't thinking about testing at all when I was listening to it. I'd already heard this song half a dozen times before but this time one of the lines jumped out at me.

'So before you go out searching, don't decide what you will find.
Be more kind, my friends, try to be more kind.'

I know that Software Testing isn't what Frank was talking about when he wrote these words, but it instantly struck me as a good motto for exploratory testing. I've been talking about exploratory testing quite a bit over the last month, both in work and at Software Testing Clinic (http://www.softwaretestingclinic.com). Maybe as a result of having it on my mind, this just seemed to fit somehow.

So, whilst not looking for any related meaning, what I found unexpectedly was a good strategy for approaching your testing, made up of two connected ideas:

  1. Before you go out testing, try to be aware of cognitive biases. (So before you go out searching, don't decide what you will find.) 
  2. Be kind to the people around you who have made the product. Be sensitive to other people's feelings; even if it's hard, just try. (Be more kind, my friends, try to be more kind.)


To start with, try to go into your testing with an open mind about what you'll find. It could bring you some interesting, important discoveries. Also, don't assume the worst of your colleagues. Of course, the last six times you got a build on this project it was riddled with issues, but this time it might be better, so be positive towards it. You never know what this change in approach might turn up: new things you'd never noticed.

Software testing can sometimes be quite combative, even if just in the way people talk about what they're doing. So is this not a useful thing to remind yourself, to be kind to others? It is easy for us to forget the human element of a system when we find problems in a design, an idea or a piece of software. Someone has put effort into making it, and they may be emotionally invested in its success. It can be useful to remember this when providing feedback on what you've tested and what you've uncovered.

I suppose the thing that I'm really trying to say is: bring some kindness into the way you approach your testing. Sure, be ruthless in your quest to uncover unknown risks, but be respectful to the people you encounter and those it affects. Assume that everyone has done their best and be nice to people regardless of what you find.

It's worth saying that some people may say "Dave, that's rich coming from you." To those people I would respond, "I know that, but if it was something I already thought I was good at I wouldn't need a musical reminder." I might also say: he says 'TRY to be more kind', so it might not be easy, but you should try anyway.

Anyway, those are my words for you today.
'So before you go out searching, don't decide what you will find.
Be more kind, my friends, try to be more kind.'



Sunday, 7 January 2018

What do you want from me

So, this post is a response to something that I read in a blog that was shared in work.

https://thelifeofoneman.com/startup-tester-survival-guide

It got under my skin a little and it's taken me a while to fully comprehend why. Let's be clear, I've never worked as a tester in a startup, though I have frequently worked in teams where I've been the only tester.

To begin with I thought that I disagreed with the whole sentiment of the post. I've reread it, and found that I don't have a fundamental problem with the core of the post but that there is a bit at the end which has stuck with me:
Remember: The role of the tester is not to find issues, it is to ensure that the product quality remains at a high standard and never accept that something cannot be improved for the next iteration.
So, I spent a long time believing this, that it was my job to 'ensure that the product quality remains at a high standard.' I found this, personally, in the roles I've had, to be very poisonous. Unless you have the power to stop the release train and get the things fixed that you wanted fixing, trying to 'ensure quality' is going to be a thankless task.

So you shouldn't care about quality at all? You can care about quality, but if you do not have the power to decide what's going into live you can't ensure it, you can only try to inform on what the current state is.

So, you think the role of a Tester is to just find issues? No. I think that the role of a tester is fluid and differs depending on the rest of the team. The times when I'm happiest, and I think most productive, are when I'm doing the following:

  1. Challenging assumptions
  2. Investigating the software
  3. Providing information on how the software works, based on investigations I've performed
But not every job is like this. Not every role is like this. Most importantly, not every team will allow you to always do this in the ways you want to. Everywhere is different. I think the thing that has stuck with me was how final the statement sounded:
Remember: The role of the tester is not to find issues, it is to ensure that the product quality remains at a high standard and never accept that something cannot be improved for the next iteration.
For some people this isn't what your role is. In fact, in most of the roles I have had, this hasn't been my purpose. It's frequently been a product owner who decides what quality level the software is at; I'm simply informing them to the best of my abilities.

I suppose what worried me is that if I'd read this five years ago I'd have thought I was failing at my job because the quality wasn't improving no matter how hard I tried, and I'd have dug in harder and become more frustrated and unhappier with what I was doing. So, what I would say is: this might be your role, but it might not be.

Wednesday, 8 November 2017

Do you not want to be a developer instead

This last week I've been doing a little scripting to help me create test data. It's given me a very good reminder of why I'm not a Software Developer.
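(If you're wondering what kind of scripting I mean, it's nothing grand. Something roughly along these lines; a made-up Python sketch rather than my actual tool, with invented names and files, which just churns out some fake users to feed into whatever I'm testing.)

    # A hypothetical sketch of the sort of test-data script I mean, not the real tool.
    import json
    import random
    import string

    def fake_user(user_id):
        # Build one made-up user, deliberately including some awkward values.
        name = "".join(random.choices(string.ascii_letters, k=random.randint(1, 40)))
        return {
            "id": user_id,
            "name": name,
            "email": name.lower() + "@example.com",
            "age": random.choice([-1, 0, 17, 18, 65, 120]),  # boundary-ish ages
        }

    if __name__ == "__main__":
        users = [fake_user(i) for i in range(50)]
        with open("test_users.json", "w") as handle:
            json.dump(users, handle, indent=2)
        print("Wrote", len(users), "fake users to test_users.json")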

I can 'code' to a certain extent. I know the fundamentals: I did a computer science degree so I can come up with three languages I know I don't like. But I just don't find a great deal of joy in writing software.

I've not yet finished the tool I'm building, and though I enjoy it every time bits of it work, I don't find the formation of the solution as satisfying as maybe I should. The part that's been most interesting so far has been finding that a piece of software I'm testing doesn't do what I assumed it does. In fact it does something weird and that's potentially a problem. Finding that quirk felt rewarding in a way that deciding how to do things in a reusable, maintainable and understandable way just doesn't.

Over my career so far I've been asked the following question at least a dozen times: "If you can write code, why not be a Developer rather than a Tester?"
I've given plenty of answers in the past but none of them feel as complete as the following painfully extended metaphor.

(Firstly, understand that I know almost nothing about planes, guns, war or medieval carpentry. And one of those things isn't relevant.)

When people ask a Tester whether they want to be a Developer instead they're sometimes making a fundamental mistake. They think I'm a co-pilot, but I'm not, I'm a gunner. The developer is flying the plane, sure, and I'm along for the ride, but I'm not wishing I could fly the plane. I'm shooting down issues and doing something totally different. Sure, I could fly the plane if I really needed to, but it's not what I'm good at, and I don't think I'm ever going to care that much about it. I like the shooting stuff part though. In the same way, the pilot could man the guns, but they're not going to be as good at it as me and they are always going to wish they were flying instead, because it's a different role.

Without a gunner, you can still get to your endpoint and maybe your co-pilot will hit a couple of targets due to luck. But what you really need is someone dedicated to it. You want someone who loves the meticulous work of thinking about the best way to hit the target, sitting in the plane whilst it flies, and truly nailing it when the time comes.

Monday, 23 October 2017

The guilt of the stopping plates

I believe that Testers can be involved in the development process in a myriad of ways. But does this cause me to feel guilty sometimes? Wait, this might not make any sense. Let me explain.

Testing for me can be a whole host of tasks. For instance, I enjoy doing exploratory testing and I can sometimes find some pleasure in running prescribed regression tests. I love getting involved in the early discussions of a piece of work and I understand the importance of supporting a feature going live. Investigating issues in the software, whether that's before shipping or in live, can also be a great thrill.

I've worked on teams where I've been involved with looking at the monitoring, and I understand the benefits of engaging with customer feedback. I've written automated integration tests at multiple different levels of the software stack and I can frequently be found advocating for good build pipelines.

Whilst I am talking about advocating for things, I believe that testability is everyone's responsibility because it benefits everyone when it is done well. I've dabbled with contract tests and I think performance testing is important too. Which leads us on to security and privacy, which I wish I spent more time on.

Above I've listed at least 15 separate elements of testing, and I'm sure if I sat for long enough I could come up with 15 more. That is a dangerously large number of plates to keep spinning at once. Try visualising keeping that many plates spinning. See, it's foolish, right?

But I think all those things above are really important; I think that if I didn't at least think about them I'd be a bad tester. In fact, it's disingenuous to speak about that as a hypothetical. It's not that I imagine I would consider myself a bad tester if I dropped any of the plates. I know for a fact that whenever I catch myself not having done one of them I do more hand-wringing than is sensible. I blame myself and believe I'm terrible at my job if any of those plates stops spinning and hits the floor.
 
What I should do is consider that if I tried to do just those first 15 things every day, a 7.5-hour working day split 15 ways would allow around 30 minutes for each task. Clearly it's unreasonable to expect anyone to achieve that level of context switching. I wouldn't expect it of anyone else, but I always expect it of myself.

Maybe in this world where testing means a lot more than running manual test scripts we should sometimes remember that it can't always mean engaging in all of the practices all of the time. It would be more realistic to just keep some of the plates spinning and accept that the others aren't in play. Or delegate them, get a developer to spin some of them most of the time and you just check in every now and again. Or maybe just forgive yourself sometimes for only being human.

Even as I write this I know that when the next performance bug or hole in coverage is noticed I'll forget this logical reasoning all over again. But maybe we should be okay with this, because the joy of realising testing can add value in more ways has come with the burden of noticing when you're not playing your part in all of those ways.

It is only words. And words are all we have

Words are important, right? The words we use to talk about the things we do are important. They have to be, because we spend considerable effort debating them.

This is something I've discussed with colleagues quite a bit over the last couple of months and I think it's interesting. When you're trying to grow a testing team in a culture that doesn't fully understand what testers do, which is probably every culture in all fairness, being able to have an agreed understanding of your purpose and methodologies as a discipline is really useful. Obviously, agreeing on things is easier to do when you're all agreed on your vocabulary. This feels to me to be not all that controversial.

However, sometimes certain terms can acquire a bad reputation if a team has had a bad experience of them. I know this first-hand from seeing how people who've had unproductive encounters with 'BDD' react when you start using some of the associated words. As a result, I worked in a team where we'd frequently have 'Kick-off' meetings for a ticket. This was a process where multiple disciplines would sit down and discuss a ticket before we worked on it, collaboratively adding some Acceptance Criteria to the ticket and discussing what and how we'd develop and test a feature. There were frequently some front end automation tests created and some regression tests to add to a pack that could be used as a kind of documentation of what the feature now did.

It doesn't take a genius to see that although we had very actively dropped the terms '3 Amigos' and 'Scenarios', we were still doing a large chunk of the core elements of a software development practice which a large section of the team had written off as a disaster. Had this team salvaged the working parts out of the wreckage of a previous practice left burning in the ditch? Or were they basically repainting the car and driving around in it, strongly proclaiming that their previous vehicle had driven itself into a tree, that they had nothing to do with why it had crashed, and that they had no interest in fixing it (while ignoring the fact that they were still using it)?

Today I was watching Atypical on Netflix, and at one point the father goes to a support group for parents of kids with autism. He doesn't regularly go to the sessions and therefore gets corrected multiple times for using words in a way the rest of the group sees as unacceptable. They have all obviously agreed on a ubiquitous language over time, and the way he was talking about his life and family was deemed to be offensive and lacking enlightenment. Basically he was just trying to help his family and understand his place, but he was being judged because he didn't understand how this closed group were using language. The problem he has is that they have grown sensitive to certain words and are unable to see through his phrasing to the intent of what he's saying, and so the conversation becomes unproductive.

Is this a problem we have? I certainly know how much the phrase Quality Assurance makes me want to launch into a lecture on how I reject that label. Do we need to be careful about discounting people as 'not our kind of tester' just because they haven't become linguistic clones of us?

The more I thought about this the more I realised that what's important is to give people the opportunity to explain what they mean. Scratch beneath whatever buzzwords people are or aren't using and you may have more meaningful discussions. Yes, it's a bit more work, but it's worth it because communication is not always as universal as we think, and not everyone understands the strict lexicon your tight-knit group has formed. Whether that is a development team, a testing department in a company or a local test community.

So yes, they are only words. And words are all we have. Use them carefully but forgivingly at the same time.

Sunday, 22 October 2017

How many columns do you need on your scrum board?

How many columns do you need on your scrum board?
I saw this conversation come up on a Slack channel discussing Agile Testing and I almost replied, but I realised I was about to rant, so I started putting it into a blog post. Here's the end result. Hold on, it's predictably ill-informed and rambling.

So... How many columns do you need on your scrum board?
Well, I have worked in a team where we essentially had (TODO, In Progress, Ready for Release, Done). I think in a place where you aren't in control of releasing whenever you want, that's the best I would want. I always want to treat (Dev + Test + Code Review) as just 'doing the thing'. So that should all be one column, in my opinion, for what it's worth. And it's worth a great deal in this tiny space of the internet, remember, because I'm in charge.

So, over time I'd grown to believe that fewer columns meant a team working closer together, and that's what I always wanted to strive for. And then I was working with a team where we had (TODO, In Progress, Ready to Test, Tested) and I found myself campaigning for more columns. Which obviously I should hate myself for.

But I had a good reason. The developers would throw everything into ready for test, but the deployment to our test environment required collaboration with other teams and a bucket load of manual steps. As a result, we would have things which had been put into ready for test which would sit there for almost a week before they were even built, never mind deployed to a place testers could see them.

It was frequently being stated that we didn't have enough testers, or that we were being made to work on other things. Basically, according to this point of view, Test was proving to be a bottleneck in the process. From my perspective, our main problem was that we couldn't actually get our hands on things to test.

So we started discussing splitting up the board to distinguish between 'dev finished' and 'ready to test'. Normally I would be arguing for just getting the team to work closer together, or getting everyone in the team to understand what could be done to make it quicker to get things to test after the development was finished.

Sometimes, however, some people in the team are reluctant to change, and your last resort is to get the board to accurately reflect reality. This can help highlight issues. You can strive in general to work a certain way, but certain circumstances lead you to want to do something totally opposite to that.
Can I tell you that this worked, that the team worked better? Actually, it's difficult to say. The end result wasn't where I learnt my lesson.
Can I tell you, with hindsight, that I support my decision to argue for this change to the board? Yes. Totally.
Did the situation mean that I think that every board should have separate 'ready to deploy to test' and 'ready to test' columns? No, not at all.
So, how many columns do you need on your scrum board?
As few as possible, until you need more than that.