Chris George (@chrisg0911) November 25, 2015
This month Lean Coffee took place at Jagex. As we were about 20 people, we split up into three groups. My group discussed four topics in the hour we had, but a backlog of good ones was left over. A lot of food for thought, good advice and engaging conversations for me! And this time around there was even an additional treat, as Michael Ambrose, Web QA Lead at Jagex, gave us a quick tour through the building.
Many thanks to Chris George for organising, Jagex and Michael for hosting and all attendees for letting me in on their thoughts on testing!
My scribbled notes from the Meetup (with no claim on correctness!):
Testability – what does this mean, really?
- Often this leads towards software architecture and design, e.g. principles like single responsibility, aiming for independence of functions or parts
- How easily can you put software into certain states or conditions when testing?
- Ask upfront whether something will be testable/checkable; also consider the automation point of view
- Likely you’ll need to extend the product or a certain feature at some point – how straightforward would the change be to implement, to test?
- Look for consistency, it can make your life easier
- Don’t be afraid when stepping into “Software Design” domain here. As a tester, to be able to test a feature is a solid requirement, something you need in order to do your job.
- Don’t be afraid to ask questions! Both developers and testers can benefit from this :)
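To make the “putting software into certain states” point concrete, here is a minimal sketch (not from the discussion – the discount example and all names in it are invented): injecting a dependency such as the clock, instead of reading it internally, lets a test reach a hard-to-hit state directly.

```python
from datetime import datetime, timezone

# Hypothetical example: a discount that is only active at the weekend.
# If the class read the real clock internally, the "weekend" state would
# be hard to reach in a test; injecting the clock makes it trivial.
class DiscountService:
    def __init__(self, clock=None):
        # Default to the real clock in production; tests pass a fake one.
        self._clock = clock or (lambda: datetime.now(timezone.utc))

    def is_weekend_discount_active(self):
        return self._clock().weekday() >= 5  # 5 = Saturday, 6 = Sunday


# In a test we can force any state without waiting for Saturday:
saturday = datetime(2015, 11, 28, tzinfo=timezone.utc)
assert DiscountService(clock=lambda: saturday).is_weekend_discount_active()

monday = datetime(2015, 11, 23, tzinfo=timezone.utc)
assert not DiscountService(clock=lambda: monday).is_weekend_discount_active()
```

The same idea applies to databases, network calls or random number generators: anything passed in from outside can be replaced in a test, which is exactly the kind of design conversation testers should not be afraid to join.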
When /not/ to get involved as a tester?
- Interesting to reverse the question – as we usually focus on trying to get involved very early, e.g. to already test requirements
- It is a question that can be addressed to the team – where does the team think a tester should/shouldn’t get involved?
- Wherever you are involved, even when it’s not your domain and you’re not comfortable voicing an opinion, you can still bring the tester’s mindset – be inquisitive, ask questions!
- “Do not get involved in toolchain discussions” vs. “absolutely do get involved” – e.g. in order to consider how to automate tests for a specific technology stack, or when it covers areas like Continuous Integration
When are testers not testers?
- When they are mere checkers?
- Everybody could take on a tester role, though it is not ideal to test your own code
- It’s a role/mindset thing – inquisitive, imaginative, questioning, creative, curious are the traits of a tester
- Culture might be a factor though, and what you are actually dealing with, e.g. there might be an area where you have more compliance and mere checks and less room for exploratory testing (Hardware?)
- Passion often shows in a tester, e.g. on getting a valuable product to the customer
- Testing is often working your way from unknown to known (in exploring, getting to know the product, feature, its state), so that you can provide information
- When raising bugs, a “checker” might mechanically raise every issue found, whereas a “tester” weighs them and looks to find and emphasise the important issues first
Software Testing Metrics – any experience which ones work?
- Automatically gathered, e.g. via code coverage tools
- Manually gathered, e.g. via listing features, prioritising these and listing time spent, issues found, general impression of stability, etc.
- Could be visualised, e.g. using Mindmaps or as Heatmap
- Charters can be useful here, as they are already time boxed – but take care not to limit testing for the sake of metrics; empower testers to apply judgement and allow for going off to explore.
- Think about what you are trying to achieve with these metrics. Who would look at them? What would/should this information be able to tell them? For what purpose?
- You should be able to give a report on what/how you tested. Just saying “I did an exploratory charter in that area” might not be too helpful a report if you can’t answer the follow-up question “Okay, and what did you test exactly?”
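The manually gathered metrics above can be aggregated with very little code. A minimal sketch (the feature names and numbers are entirely invented): turning a list of session records into a per-feature summary with a crude “heat” value, which could then feed a heatmap like the one mentioned above.

```python
from collections import defaultdict

# Hypothetical session records: (feature, minutes spent, issues found).
sessions = [
    ("login",     90, 4),
    ("login",     30, 1),
    ("inventory", 60, 0),
    ("payments", 120, 7),
]

# Aggregate time spent and issues found per feature.
summary = defaultdict(lambda: {"minutes": 0, "issues": 0})
for feature, minutes, issues in sessions:
    summary[feature]["minutes"] += minutes
    summary[feature]["issues"] += issues

# A crude "heat" value: issues found per hour of testing.
for feature, s in sorted(summary.items()):
    heat = s["issues"] / (s["minutes"] / 60)
    print(f"{feature:10} {s['minutes']:4} min  {s['issues']:2} issues  {heat:.1f} issues/h")
```

As the discussion cautioned, numbers like these are only a conversation starter – a feature with few issues per hour may be stable, or may just not have been explored properly yet.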