Thursday 12 May 2022

The Guilty Tester - Podcasts - Month 3 Part 2 - Learning Challenge 2022

Over at the Guilty Tester Podcast I have embarked on the challenge of listening to TOO MANY technical podcasts in a tight timeframe.

Here is a list of the podcasts and the accompanying links. These are the links for the second episode chronicling this strangely difficult challenge that I set myself.

16. Women who change Tech: Episode 11: Isabel Evans - A Woman Who Is A Storyteller
https://womenwhochangetech.libsyn.com/episode-11-isabel-evans-a-woman-who-is-a-storyteller

17. AB Testing: Episode 156 - Customers and Quality
https://anchor.fm/abtesting/episodes/Episode-156-Customers-and-Quality-e1fkvbt

18. Security in 5: Episode 1172 - CISA Must Patch List Is Something Everyone Should Follow
https://securityinfive.libsyn.com/episode-1172-cisa-must-patch-list-is-something-everyone-should-follow


20. Software Engineering Unlocked: Running a Developer Community

21. The Testing Show: Six personas in software testing to avoid

22. That's a Bug!: The Buggiest Game of All Time

23. Quality Coaching Roadshow: Clare M Goss

24. Maintainable: Idit Levin - Production Is the Real Test

25. The Testing Show: Do We Still Need the Phrase “Agile Testing”?

26. Test and Code: Who should do QA?

27. Maintainable: Jerod Santo - Having to Maintain Your Own Cleverness

28. Smashing Security: 267: Virtual kidnapping, two helipads, and a naughty Apple employee

29. Security in 5: Episode 1173 - FCC Adds Kaspersky As A Threat To National Security

30. ATP: That's where the magic happens

31. Smashing Security: 268: LinkedIn deepfakes, doxxing Russian spies, and a false alarm

Thursday 31 March 2022

The Guilty Tester - Podcasts - Month 3 - Learning Challenge 2022

Over at the Guilty Tester Podcast I have embarked on the challenge of listening to TOO MANY technical podcasts in a tight timeframe.

Here is a list of the podcasts and the accompanying links.

1. Test and Code: Exploratory Testing - Feb 8th 2022
https://testandcode.com/179

2. Test Guild Automation Testing Podcast: The Reality of Testing in an Artificial World with Angie Jones:194
https://testguild.com/podcast/automation/194-reality-testing-artificial-world-angie-jones/

3. The Evil Tester Show: Automation Biases - Episode 16
https://www.eviltester.com/show/016-automation-biases/

4. Testing One-on-One with Rob Lambert and Joel Montvelisky: E33 - The Pitfall Of Estimations And How To (Try To) Avoid Them
https://qablog.practitest.com/podcast-pitfall-of-estimations/

5. Testing Peers Podcast: Episode 50 - Community
https://testingpeers.com/?p=1749

6. Quality Coaching Roadshow: Episode 17 - Quality Coaching Roadshow with Abby Bangser
https://www.spreaker.com/user/charrett/abby-bangser-podcast

7. That's a Bug! Episode 13. Is your last name True?

8. AB Testing podcast - Episode 155: Quality Coaching with Anne-Marie Charrett.

9. RBCS - Free Webinar: Discussions Testers Should No Longer Be Having - Two Points of View at Two, with Rex Black and Mike Lyles

10. Software Testing Unlocked: Tests to Find Bugs

11. Tech Tales: The Asus Eee PC - 27 Feb 2022

12. Tech Team Weekly: Complexity Is Killing Software Developers

13. Freedom Matters: Working Solo - Rebecca Seal

14. Test Guild by Joe Colantonio: The Complete Software Tester with Kristin Jackvony

15. Testing Peers: Episode 51 - Persuasion

Wednesday 4 December 2019

TestBash Manchester 2019 and The 8 Things I Learnt From Standing Up For Too Long

In October I went to TestBash Manchester 2019, and one of the things I was doing there was taking part in the UnExpo. I thought I'd talk about my experience. So welcome to The 8 Things I Learnt From Standing Up For Too Long.
So, I was taking part in the UnExpo. What's an UnExpo? Well, actually, let's start with what an Expo is. The Expo at lots of conferences is the exhibition bit. Generally it's a room in the venue, normally just outside where the good stuff is happening. It's where lots of people sell their tools, their companies or their services to you. At most testing conferences it's people selling you test management software or "magic click-and-replay" automated testing tools.
Fun fact: the Expo-type hall at a brass band contest is where people will sell you the written parts for poorly arranged pop songs, ugly uniform jackets with tassels on the shoulders, or valve oil. For making valves oily.
But I digress: back to Test Expos. 
MoT (Ministry of Testing) decided they wanted to try something where people weren't just being sold things but were talking about ideas. So at Brighton 2018 they tried an UnExpo. Basically it's like a science fair: people turn up with big posters, or make them on the day, and then they talk about whatever ideas they want. Having been pleased with the results at Brighton 2018, they decided to try again, and so they did it at TestBash Manchester 2019.
There was a large variety of tangentially testing-related topics and activities being covered: a security quiz; programming a robot to go round a predetermined track; and one surprisingly nervous individual trying to discuss with people the things about testing that make them feel guilty.

Yep. The person getting people to talk about guilt was me. For those of you that don't know, I have this theory that when people go to conferences, read blogs or go to meetups they see how other people are doing "Modern Software Testing" and they feel guilty for not living up to those ideas in their day-to-day job. My idea for my UnExpo stand was simple: talk about the things you feel guilty for not doing and put them up on post-its on my board. The more things we all admit to, the more we all understand that everyone feels guilty about the same things, we realise we're not alone, and hopefully we all feel a little happier with how we're doing. That was the theory anyhow.

So, what did I learn from the day? These are just a few of the things I learnt. This doesn't cover everything people wrote down; it's just a brief overview of some of the things people said. I'll also include a few nuggets of wisdom not derived from specific individual interactions or post-its.

1. Some People Are Worried About Not Doing Enough Automation
There were quite a few people who talked to me about how they don't do automation. Some of them don't do it because they can't convince their teams; some don't do it because they don't like doing automation. But there was a theme of people feeling guilty because they don't do automation.

2. Some People Are Worried About Doing Too Much Automation

(The post-it says: "Am I focussing too much on automated testing?")
In contrast to the people who are worried they are not doing enough automation, some people are worried they’re doing too much automation. Which I think says it all really. As a profession we feel damned if we do and damned if we don’t.

3. Some People Like Not Knowing What’s In The Box
Interesting, for me, this one: that people feel guilty for just enjoying testing things without knowing what's inside. I think that the deeper you understand the software architecture, the better you get at understanding its quality, but I also accept that a driving factor for what people do with their days should be what they enjoy, so this leaves me a little conflicted.

4. If You Stand Up In A Busy Room And Try And Project Your Voice For A Long Time You Will Get A Headache
No post-it for this. Just an observation from me. Over the course of the day I did probably around three hours of standing up trying to talk to groups of three or four people in a very noisy room. After the first hour I had a very noticeable headache that made it hurt whenever I subsequently spoke. I had that headache pretty much till I woke up the next day. FUN!

5. Some People Are Worried About Learning The Wrong Things
At a conference full of people talking about all these exciting new software development techniques and practices you should learn, some people were worried about not spending time on the best ones. Basically, not knowing which new ideas are the ones to chase is causing people not to commit to learning anything, and I think that's a shame.

6. We All Think We're The Only People "Doing It Wrong"
The first post-it was something someone wrote about feeling guilty that they've never been in a team doing agile well. That seemed to be a theme: that "everyone else is doing something magically right apart from me". Although Neil Studd was talking about testing generally rather than agile, I agree with him and think his point can be extended: "no one really knows what right means."

7. A Programmable Robot Is More Interesting Than I Am
I suppose I could have guessed that. That area had more traffic than mine; however, in fairness, I had few occasions where I wasn't talking to anyone, so I'm not sure how much busier I'd have wanted to be.

8. Taking Part In An UnExpo Is Fun But Hard Work
I would encourage you to get involved in this kind of thing. It forced me to talk to people I didn't know at a conference, which I'm normally not good at. Honestly, I hate approaching people I don't know because I always feel that I'm ruining their day and that I'm not who people want to talk to. So having a purpose for interaction was nice; it was good to have structure. I also got a free ticket (which may not always be the case but was for me), which is very good, so I got to see the rest of the conference and was forced to challenge myself socially.
--------
For more info on the MoT UnExpo see this write-up from 2018.
Also, I've so far done two episodes where I break down my opinions on the things people divulged and dig into many more of the post-its. There's another one to follow in the new year as well.

Sunday 14 October 2018

Can you just run a Full Regression on this please

Recently I saw someone ask for a 'full regression.' It is probably the first time this specific person has uttered the phrase in my presence, but I've heard it from quite a variety of people for a plethora of reasons. It's a term that has begun to frustrate me, and it seemed like it was time I got my thoughts down as to why I am experiencing this mild irritation. Maybe you don't care that it is causing me annoyance. Maybe you aren't seeing this anywhere so it doesn't seem worth talking about. Maybe you should appreciate that this blog isn't about your problems, it's about mine, so you can either enjoy me pontificating on problems in prose or go elsewhere for undoubtedly higher quality content.

The source of my problem is that when someone asks for a "full regression" I'm not sure I understand what they think they are asking for. Or perhaps the problem is that I have a belief that I do understand what they are asking for and think they shouldn't be asking for it. And yes, I have probably just made this more complicated rather than less.

So, let's work it through. When someone asks for a "full regression" what are they really asking for? Do they really want you to run every scenario you've ever thought of? Surely they can't really want you to go through every possible path you've ever thought of in the product. Will that even be possible in whatever sensible time frames you have? What if you run routes you have not executed before? If you run through paths that were previously unexplored, how can you possibly say whether it actually works the way it worked before the change, when you have no idea what the behaviour used to be? So either your mission to look for regressions in the software is dependent on having, at some point in the past, done a surprisingly thorough job of testing, or right now you have to perform the testing on a version of the software without the changes and one with. That might not even be possible. Maybe that's what's being asked for though.

Or is what they're really saying, when they request or suggest a 'full regression', "This change has been so vast and this next release is so important that you should spend infinite time getting us as much information as humanly possible about how the application works"? This raises the issue of infinite time frames requiring you to live for an infinite number of years, putting your ability to complete your testing within your lifespan seriously into question. But it could well be what they intended. After all, they did say "full."

Or do they mean "I have no idea how what I've changed will impact the product because I don't actually understand how this thing works beyond the section I changed. Basically I'm working blind here. Therefore can you just take all of the risk that I've created by telling me it's 'okay'? Just so long as we all understand that I don't have a good definition of what is intended by the word okay." That could well be what they mean, right? It's always what I interpret it to be. And that makes me grumpy.

I don't know whether other people have heard this term used, or have in fact used it themselves, but I find it troublesome. If someone tells you that the change they've made doesn't have any new functionality but you just need to run a full regression, that should instantly set off alarm bells.

It might be worth discussing with them how there's no such thing as a full regression, for all the pedantic and sarcastic reasons I listed above. (You may want to phrase it less argumentatively though.) If they are still talking to you after that, you could ask them to help you understand what they've changed so you can do some targeted testing and get a better idea of whether their change has had any negative effects. Surely that's what they really want anyway. It's just easier to pretend that there is a magic risk-free way of achieving this. It is unfortunately our job in this situation to remind them that magic is a lie and that to pretend otherwise leads to problems. Tell them it's impossible to test everything, because that's likely to be infinite testing. Try to help them understand that all testing is actually limited to a restricted area and is therefore, in a sense, targeted. When you don't understand what you're aiming at you are still targeting your testing at certain areas; they're just less likely to be the right ones. Ask them to help you set the targets intelligently together based on what has changed. Any work you can do as a team to narrow the sights of your targeted testing a little will increase the chance of it being effective.

Above all else resist just shouting "WHAT DOES FULL REGRESSION EVEN MEAN?" whenever anyone uses the term, but challenge them politely, respectfully and calmly. It's important because if people use the phrase and you just nod and tell them you've done it they will keep asking for it and no one will ever resolve what the problem is. Also, they could actually mean "this is the most dangerous change anyone has ever made to anything, what do we need to do to reduce the risk?" and you'd never know how scared you need to be about the change because you haven't asked them to clarify but have assumed the worst of your coworker.

So what is a "full regression"? It's a lie people tell themselves when they don't want to admit the truth. Try admitting it instead, it's hard but it's probably safer and it's definitely more genuine.

If you want to hear me talk about testing some more and interview people about their guilt around their own testing, why not check out my podcast? It is the primary reason I don't blog here that much at the moment. That and life getting in the way.
theguiltytester.libsyn.com

If you want to shout at me and tell me I'm wrong about this I am surprisingly receptive to that. Tweet me @allcapstester

Thursday 10 May 2018

Non-perfect projects

So I have been thinking quite a bit recently about testing in situations where things aren't perfect. I have a 'creative project' potentially in the pipeline to talk about this more, but whilst that is still in its embryonic stage I wanted to shove something out into the world.

Before I continue I wanted to start by saying that I believe that everyone is entitled to their own opinion about what makes a Tester an 'Agile Tester.' With that said, for the purposes of this post, let's all just agree that my definition is what we're working from. Largely because otherwise it gets complicated, and also because I'm right.

I believe that practising 'Agile Testing' means interrogating the solution all the way along its journey. From before the code exists. Probably, in fact, from before the first description of what someone thought the problem was. In order to do this you want to be part of a strong development team which works collaboratively. I love being part of figuring out what a solution is with a team and testing it all the way from the beginning up to the point where it goes live. (And beyond.) I enjoy exploring how things work and figuring out the nuances that make the software interesting. I'm a big supporter of the idea that a key part of a tester's role is giving the team information about how the product works to enable decisions about whether that's what's desired or not.

A while back I worked on a team for a few months where I was testing how a solution integrated into a larger group of products. (Due to the structure and the nature of the project, all of the statements I'm about to list were true. I promise you it wasn't realistic to change most of these things.) I hadn't really worked on any of the teams that make any of the products. This included the development team making the product which I was checking for integration. I was not invited to their sprint planning and I was not invited to their retros or standups. I was testing the culmination of how two different teams' worth of work fitted into a dozen other pre-existing products, each with their own teams. And I didn't really belong to any of these teams. I didn't have defined stories to test or a clear way of getting things fixed when I found problems. So obviously I had stopped being an 'Agile Tester.'

Except had I? There was a daily project standup with representatives from the business side and some technical leads from some of the products. That helped propel things forward, and I had a voice in that meeting where I would feed back what I had been working on, discuss what I was planning to do next and get my blockers off my chest to see if anyone could help me resolve them.

I was not working to any tickets and that was hard, because it required self-discipline not to get lost in rabbit holes. But I was constantly exploring new things and investigating what was happening in comparison with what I was expecting to happen. Because of this I was questioning my expectations constantly. In fact, on many occasions we were lacking convincing oracles, so I was hunting them out and assessing them for their trustworthiness. This part was frustrating at times but also had moments when it was very fulfilling. If testing is about learning how things work when no-one knows, boy was I learning about how things worked.

I would meet regularly with the product owner to discuss his priorities and he would steer me in the direction of what he wanted me to learn about. Generally it was "Can you check whether part A works?" But he and I both understood that we didn't really know what 'part A works' meant so a large part of what he was asking me to do was go learn about something and give him information and context so he could assess whether he was happy with what was happening.

If I found issues I didn't have a dedicated development team round me to fix them. What I did do was discuss these with the product owner regularly so we could prioritise the ones that would derail the release schedule and figure out what teams we needed to talk to in order to get things resolved.
Frequently I would then coordinate with other developers and testers in other teams to come up with solutions to problems and help manage them into reality. I would sometimes test the changes in isolation on the system where the fix was being applied, or test how they integrated with the software I was primarily concerned about.

So this role I was doing was a little frustrating and to begin with I was concerned that I was just being a quality gate at the end before it went out. To a certain extent maybe that was true. And for that part I had become the kind of tester I don't like being.

On the other hand I was working very closely with the product owner. I was constantly exploring new things and I was being pushed to interact with lots of different teams.  That project caused me to build relationships with people in a number of different teams within the organisation. These relationships have proven to help me greatly with other things I've worked on since.

But was it Agile Testing? Maybe not. But not all situations are perfect. When I started to embrace what I was doing, particularly the elements of it that felt true to how I think about testing, I started to enjoy what I was doing more. It is quite easy to get wrapped up in all the things in your situation which are not good practice, and that can lead to thinking that you're not really a modern-thinking tester. Obsessing over all the things in your project or company which don't fit your idea of good software development culture can encourage you to compare yourself with people you follow on Twitter, blogs you read or speakers you see at conferences who seem to be doing "real testing". Comparing yourself to these people can make you conclude that you're a fraud. It may cause you to get angry at everyone around you and try to solve all the problems at once. Because, after all, they are making you betray your core principles of what you believe testing should be.

It's worth remembering that when people tell you a story they provide the evidence to support their narrative and miss out the facts that muddy up the details. Those testers who appear to have ideal companies and fully supportive, testing-enlightened teams are likely to be doing similar things to you and making similar compromises, but that doesn't fit the narrative they're telling. So maybe don't judge your reality against other people's stories.

Finally, I'll repeat what I mentioned before. Embrace the parts you enjoy and that you think fit with how you want to work. Sure, try to improve your situation, but be realistic about what you can achieve with the level of power you have and the resources available to you. And lastly, don't beat yourself up if you don't think you're doing perfect Agile testing. I'd bet you that no one really is.

Saturday 28 April 2018

Testing Tunes 2. Embrace the jazz hands.

Okay this is the second post in this odd series. I have a more testing specific post knocking about in my head but I'm trying to figure out how to trim the ALL CAPS ANGER out of it. Whilst I'm doing that, this arrived partly formed so I thought I'd better finish it.

So, here's a Testing Tunes #2. I'd not be surprised if you don't know who Felix Hagan and the Family are but I promise you they are well worth a listen. I suppose the way I would describe them is 'storytelling glam rock with a heavy dose of musical theatre influence'. I mean that in the most positive way, I promise you. It's melodrama with lyrical flair and I love it.

I was on the last few minutes of my slow cumbersome run from the train station into the office the other day, playlist blaring through my headphones on random just to get me to even move my feet. I was ready to get to a shower but wasn't sure how ready I was to start the day of work when the following song soundtracked my last few steps and entrance into the building.

(apologies for a Spotify link, I couldn't locate it on YouTube)

It's a song I've listened to dozens of times before but for some reason on this drizzly morning I noticed a specific line in it which got me thinking in a way it hadn't before.
'There's no free ticket for the rock star with the most convincing please'
What this said to me, basically, was that timidity will not help you get what you want. Don't complain if what you want is hard to achieve. Demand to be heard. Impress people. Do it with jazz hands and get the job done. As a software tester, if you're relentlessly campaigning in the name of test advocacy to a team or company that's not fully convinced, you may need a bit of inspiration and a healthy nudge every now and again to get yourself re-invigorated. For me, this is a song that helps with that.

If you're still reading this, please listen to the lyrics of the whole song, or else the rest of my ramblings could seem off topic. The whole song, to me, is about loving what you're doing. It's about encouraging you to have enthusiasm for your craft and put your brave face on to try and make an impression and get what you want. Whilst I think that can probably be used as guidance for lots of professions, I don't do anything else; I test software, so that's what I related it to. I think this spoke to me because, as testers, we are frequently alone with our point of view in a conversation or team. As a lone voice it is useful to have some self-motivation to keep advocating for testing, even when it's hard or no-one is listening.

The other lesson I think this song can remind me of is that sometimes you need to steal from those you admire to be the version of yourself that gets the job done. It's not disingenuous to alter your behaviour or approach in different circumstances in order to achieve your goals. (I promise you that message is in the song as well.)

Try being proud of your actions. Try thinking about whether you would want junior members of staff to learn from your example. If you would want new testers copying you, keep doing what you're doing. Do it more, do it proud and do it with gusto.

So try embracing the jazz hands. Put your best smile on, stand tall and go wow the world.

Sunday 8 April 2018

Testing tunes

Sometimes it is possible to draw inspiration for your testing from strange places. It is also possible to find meaning in art that wasn't there when the creator let it out into the world. Recently I had both of those moments at the same time.

One of my favourite musicians of all time is English folk-punk troubadour Frank Turner. His new studio album Be More Kind is scheduled for release at the beginning of May, and as such he's started building some hype and releasing some of the songs. So, I was listening to one of the newly released songs, the title track in fact.



I wasn't thinking about testing at all when I was listening to it. I'd already heard this song half a dozen times before but this time one of the lines jumped out at me.

'So before you go out searching, don't decide what you will find.
Be more kind, my friends, try to be more kind.'

I know that software testing isn't what Frank was talking about when he wrote these words, but it instantly struck me as a good motto for exploratory testing. I've been talking about exploratory testing quite a bit over the last month, both at work and at Software Testing Clinic (http://www.softwaretestingclinic.com). Maybe as a result of having it on my mind, this just seemed to fit somehow.

So, whilst not looking for any related meaning, what I unexpectedly found was a good strategy for your approach to testing, made up of two connected ideas:

  1. Before you go out testing, try to be aware of cognitive biases. (So before you go out searching, don't decide what you will find.) 
  2. Be kind to the people around you who have made the product. Be sensitive to other people's feelings, even if it's hard, just try. (Be more kind, my friends, try to be more kind.)


To start with, try to go into your testing with an open mind about what you'll find. It could bring you some interesting, important discoveries. Also, don't assume the worst of your colleagues. Of course, the last six times you got a build on this project it was riddled with issues, but this time it might be better; be positive towards it. You never know what this change in approach will result in: new things you'd never noticed.

Software testing can sometimes be quite combative, even if just in the way people talk about what they're doing. So is this not a useful thing to remind yourself of, to be kind to others? It is easy for us to forget the human element of a system when we find problems in a design, an idea or a piece of software. Someone has put effort into making it, and they may be emotionally invested in its success. It can be useful to remember this when providing feedback on what you've tested and what you've uncovered.

I suppose the thing that I'm really trying to say is: bring some kindness into the way you approach your testing. Sure, be ruthless in your quest to uncover unknown risks, but be respectful to the people you encounter and those it affects. Assume that everyone has done their best and be nice to people regardless of what you find.

It's worth saying that some people may say, "Dave, that's rich coming from you." To those people I would respond, "I know that, but if it was something I already thought I was good at I wouldn't need a musical reminder." I may also say: he says 'TRY to be more kind', so it might not be easy, but you should try anyway.

Anyway, those are my words for you today.
'So before you go out searching, don't decide what you will find.
Be more kind, my friends, try to be more kind.'