Thursday, May 29, 2014

Agile Finland, it's THAT time of year again

About a year ago, I volunteered to join the Agile Finland Executive Committee. It's been an interesting year for me, one in which we've done plenty of things and started more that will continue into the future. A new executive committee will be chosen at the Annual Meeting next week (finally), and while the style of this opening may give that impression, I'm not going anywhere.

Recently, I've come to know the people who volunteer to join the Executive Committee for the 2014-2015 season in addition to me: Hannu Kokko, Olli Pietikäinen, Jussi Markula, Vasco Duarte and Martin von Weissenberg. This means there's already one more than the executive committee can accommodate according to our rules, so I voluntarily step down to be a spare member - eligible for full membership in meetings whenever one of the others is away, and always present. As the meeting has not yet been held, things may still turn out differently, just like last year, when I showed up moments into the voting, volunteered, and displaced someone who had been preplanned into the committee.

The time of year leaves me thinking about what I'd like to see Agile Finland be. The ideas are most certainly influenced by others, but this is not intended as a shared statement but as a personal view to open discussions.

Agile Finland should be more professional, run 'like a business'

The Agile Finland community is about content. But running Agile Finland takes much more than content: there's all sorts of administrivia, financial considerations and questions of what we can do as a non-profit. I'd like to see more of the work that happens in Agile Finland become paid work for someone, starting with the hope of hiring a 'secretary' and continuing with the idea that Agile Finland could actually own a company or collaborate closely with one. Organizing conferences would be more sustainable if only the content part were volunteer work. The same goes for organizing regular events.

We need to look at how the bigger non-profits organize themselves and learn from their models.

This also means, to me, that our events should be profiled as professional, career-advancing activities - not something that happens only in the evenings and on your own free time. Social aspects are important too. Unconferences, peer conferences and the like are social. But they are not free-time activities.

Agile Finland should celebrate and support skills/personalities diversity

I think Agile Development is the way of the future - with its emphasis on people. I joined with a concern that has not been resolved: agile teams seem to be leaving non-coding testers out. Again and again I hear of some enlightened customers who buy agile development teams from one contractor and add testers from another. But I hear more stories of customers buying just the agile development team and not realizing they may be missing something relevant. I love the fact that there are teams doing really great without testers, and the fact that developers in agile have learned to test. But I find it sad that there's still a feeling of exclusion for those who identify as 'testers'. Someone I talked with summed it up well: testers were struggling for respect 10 years ago and it got better. Agile feels like it's starting all over again.

When I look at Agile Finland events and participants, I get a feeling that something has changed there too. When it all started, a lot of the activity was about practicing our skills of development (coding dojos in particular). Now we have very little technical content, and very few active developers in the ranks. Most of our content is about team work, people stuff, methodologies and practices. And coaching, which seems to be the core of it all.

I drew a little picture of the interest groups I could identify as already existing within Agile Finland as I see it. I'd like to find a way to create action for different interest groups without building walls around which groups you belong to - we're all in it together and just choose to spend our time in different ways.

I'd like to see more people identify with Agile Finland and agile as the way to do valuable software. While others may find it important to reach out to ones outside development, I'd like to make sure we don't lose out on the ones who are in development. I hope we could have both.

When it comes to testers, I find that many of my colleagues in test won't join as enthusiastically as I did. And for doing as much on testing as I want - keeping the skills growing and new people joining the great work of testing - Agile Finland alone does not feel sufficient. Thus I've started Ohjelmistotestaus ry - Software Testing Finland - a non-profit that works on skills-oriented testing, a topic that should be close to Agile Finland as well. Knowing testing, growing testing and building bridges to testing is still my calling.

And the others...

In the upcoming season, I want to run a trial on agile development from a 'creating together' point of view with 1st graders - small children. I've actually already agreed on piloting this with a local school. The idea emerged from my feeling that we lose girls in particular from the software professions at a very young age. To me, it seems some of it is peer pressure and some of it is the idea that "code" is the central element. I'd like to focus on people, and I don't mind that together we transform ideas into code. Ideas are the key. So Agile Finland will have a new audience: the future of the profession, in kids.

Similarly, we need to reach out more towards students. I think Turku Agile Day has been the most energetic of the agile conferences in Finland, and I attribute a lot of that to the collaboration with students. We should have more of them, early on, learning with the more senior ones to create great content.

And surely, we should also seek to infiltrate the places that don't yet realize agile has answers to their concerns - the very traditional non-profits. Practical examples, real cases and the discussion around them could perhaps teach us all something. Turning non-agile into agile is just not my thing outside my actual work at companies. Selling agile is someone else's high priority.

Wednesday, May 21, 2014

A tester crisis with shift-left work

There's been a long-running discussion about involving testers earlier, but the term "shift-left" is a new concept for me. It says basically the same thing: involve testers early so that fewer problems are found later in the lifecycle.

I've done my share of shift-left work in the role of a tester, commenting and pointing out flaws. But as I continued working on design, I noticed I had moved from my tester role ("here's a problem, could you think of a way of fixing it") into a designer role, suggesting solutions. I find the difference essential: in the tester role I tend to remain somewhat disconnected from the eventual solution I did my best for, whereas in the designer role the mistakes we make are mine as well.

Just yesterday the first feature designed by me got into testing - to be done by me. I felt different. I felt guilty for liking the apparent simplicity of the feature. I felt it must be incorrect. I felt I must have missed something really relevant. I felt insecure.

We did a pair testing effort on the feature, with quite limited time in relation to its complexities. About half-way through our paired session, I felt the urge to mention that the design / fit-for-use aspects need to be attacked very critically, as I fear my own bias. I noticed I had selected my test approach for the session based on that concern: comparing with a similar feature in a previous version of the product, built with different technologies. I paid particular attention to the things the old one allows the user to do, making sure those could be done with the new one. And I felt even more strongly that if the previous wizard had 9 steps and the new one only 5, there just has to be something wrong. So I kept looking for evidence. I've logged a bunch of bugs (13 from our session yesterday), but none of them yet has what I'm looking for - signs of this design not working for its purpose.

With something I've created, the need to do my worst to break it feels more urgent. I will try. I will invite others to try. And I will monitor the feedback just as I do with any other feature.

I realized I've seen similar shifts of perspective turn bad earlier in my career. I've seen decent testers turn to a confirmatory mindset by participating in design, missing relevant issues while testing. So I feel it's my duty to do the extra work needed to tackle the concern. If I fail in my attempts to show it does not work, it just might work.

Then again, it's no extra duty. It's what I always do when testing features. The extra is just the nagging feeling that my bias needs attention. We'll see in just a few weeks whether it did.


Tuesday, May 20, 2014

Developers worth a (better) salary

Recently, I've spent quite a lot of thinking time on the idea of 'being a professional'. I (re)read Robert C. Martin's book "Clean Coder" to be reminded that, as per that book, professional developers estimate and test, and play well with others. The book also left me thinking that professionals are people with stories to share of their learnings, who strive to be on an improvement path towards something better.

I've also been thinking about people who are not as professional as one might like them to be. In particular, a tweet just a few days ago left me wondering:
It helped me realize that this is very much true in my experience as well. As much as I work with agile-minded organizations and colleagues, I still keep meeting many, many developers who don't do much automated unit testing. Or if they do, they do it because they have to, and just look for opportunities to maintain the unit test base by deleting most of it on the basis of an 'architecture change'. There are individual developers that I trust and respect who test - either test first or test soon thereafter, continually. But 90% of my sample is still people who think testing is trying out very little, if anything, and trusting that it either works or someone else will tell them about it.

There was one developer who started off telling me that he is too valuable to test. In his perspective, those who can't code are less worthy. The less worthy in this particular case were product management people - the same people who should actually spend their time speaking with the customers and making sure there's a steady flow of income to pay for the development. Calling them less worthy seems far off, as there's no chance the developer would get a single customer to commit to paying anything for the product. Different is not inferior. The attitude was not a very professional one. I would expect just a little more perspective, seeing things in context.

There was another developer who was (in)famous for his ability to do quick prototyping. The negative side was that while he could make the system run once through the happy paths, his code was a maintenance nightmare. There were hacks and little structure. Any way he could make it work before moving elsewhere was the deal. Other developers quietly fixed and compensated after him, complaining in the background. Making things work now without considering maintainability is not very professional either.

And there was yet another developer. His skills were not that honed, but he did his best. He just wasn't very enthusiastic about his craft. When sent to a course he had requested, he listened a bit but skipped all the exercises. His learning and enthusiasm were quite low, and required pairing in the team was his nightmare - he'd rather work solo. Again, not a very professional approach.

Thinking through all the problems I've had over the years with developers producing more bugs as they attempt to fix something, developers having no clue what could break with their change, and developers creating maintenance nightmares, there seem to be two things in common.
  1. None of these developers test and think testing. Rather, they externalize testing as somebody else's responsibility. They also tend to like to externalize requirements and designs as someone else's responsibility.
  2. None of these developers practice their craft. They don't talk to others about how to do things better, and submit themselves to feedback and learn actively. They already have "the tool" and that seems to be enough. 
A bad, unprofessional tester wastes his work time producing very little value. When a tester is bad enough, the rest of the team won't talk to him, with various excuses. The bad tester sits in the corner, creates test cases and runs them, producing test case statuses. A bad tester with a good developer is wasted effort, but not a problem. Bad tester - bad developer is a pair that seems good enough, as someone needs to do even the most basic checking. But if you've ever seen anything better, you may find it hard to accept. In my experience, in this case the bad developer gets better by removing the tester.

A bad, unprofessional developer causes problems on a much bigger scale. There's no sitting in the corner when your code ends up in the end product and fails. There are approaches I've seen where you code just a little every week but avoid doing much - to keep the scale in which you fail smaller.

I'm not looking for the perfect developer. We're all people, with different skills and backgrounds. But if you don't practice, hone your skills and actively get better at what you're doing, could you please consider doing something where you cause less harm to others? Just care a little. 

Testing is learning. It's a thinking tool. Developers who externalize testing, delivering just-don't-work-quality systems to testers, miss out on being more valuable. And I'd like to believe that we're learning, as an industry, to pay better for those who take a bigger chunk of the value chain and focus on throughput instead of sitting quietly in our presumed silos - be it 'design', 'code' or 'test'. Developers who test should be paid more. Or, developers who don't test should not be paid as much as they are now, before they learn to be worth as much as their test-able counterparts. Full-stack developer - extending beyond code & test - sounds good, but how come I see so few people like that? It seems the developers who consider themselves agile are the ones building communities and practicing seriously in dojos, retreats and unconferences, seeking out the mentors they feel they need, inside and outside their organizations. Or is that again just my sample?

Within the testing specialty, there's a lot of talk about T-shaped testers: doing either business analysis or development alongside testing, extending your skills. That seems relevant for adding to the value you can deliver, and thus what you should be worth in salary. The old-school test-case-executor testers should earn really low pay (or go extinct through automation). Active and thinking testers are needed and respected. Professional.

Saturday, May 10, 2014

Test Documentation and Mindmaps

I have a thing with documentation - I like it. I like to read books that deliver new concepts, teach me the basics of how to do certain things and make me feel inspired. I read articles, and have a soft spot especially for experience reports that don't tell how things should be done elsewhere but outline how (and perhaps why) they were done in one particular place. I like the fact that people put their ideas on paper - with pictures and text - to open a discussion in software projects. And I really enjoy working to identify the core of the information that needs to be written down.

Yet I've turned into a fan of avoiding test documentation. In my main project, I write the end-user online help to remember things I might otherwise have put in test documentation, because it makes the document shared. And it appears people actually do read it. If I have details to write down, I'd prefer to have them written down as comments and implementation in automated tests. And for difficult information on "how was this supposed to work", I regularly volunteer to add the material to the specifications if the information isn't end-user oriented.

I create test documentation too. I create mindmaps at the point when I know the least. Mindmaps are a cheap form of documenting: they allow me to change my mind as I learn, and they support the learning that happens. When I think I've learned, I typically transform my mindmaps into two documents: a checklist, usually in spreadsheet format, that supports going through the test ideas as the software lives on; and a short summary document with pictures and text of the central concepts I need to quickly remember when I've forgotten all about this area while working on another one. And I write reports on issues I notice, and categorize them by relevance.
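The mindmap-to-checklist transformation above can even be sketched mechanically. Here is a small, hypothetical Python example - the feature names and tree structure are made up for illustration, not from any real product - that flattens a mindmap-style feature tree into one checklist row per leaf test idea, ready to paste into a spreadsheet:

```python
# Hypothetical mindmap represented as a nested dict: branches are
# features, leaves (empty dicts) are individual test ideas.
mindmap = {
    "Reporting": {
        "Export": {"PDF": {}, "CSV": {}},
        "Filters": {},
    },
    "User management": {"Roles": {}, "Invitations": {}},
}

def to_checklist(node, path=()):
    """Flatten the tree into one checklist row per leaf idea."""
    rows = []
    for name, children in node.items():
        branch = path + (name,)
        if children:
            rows.extend(to_checklist(children, branch))
        else:
            rows.append(" / ".join(branch))
    return rows

for idea in to_checklist(mindmap):
    print(idea)  # e.g. "Reporting / Export / PDF"
```

Each row can then get its own tick-off columns per build or environment in the spreadsheet, which is exactly what the mindmap format itself does not support well.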

As testing is learning, I've noticed over the years that I need to throw away the early documentation structures that I create when I start to test. In the early documents, I put in the stuff that I'm told or read from other sources - stuff that usually isn't very insightful or complex; the complexity starts to build on top of it. The insights start to happen when I write bug reports: I learn that things were not as I expected, and I log a bug. The insights, if they get collected at all, get muddled with all the mundane information. And as I become accustomed to my structure of where I write what, I no longer refactor the information for other audiences. The document I started creating works well for this feature or change right now, but in my experience it doesn't serve the needs of those who come later.

Quite a long time ago, I learned to think of test documentation as an output of testing, not an input into testing. The test documentation I'd like to see created serves me when I need to get up to speed with an area another professional tester has tested before me. It delivers some, not all, of the learnings in a condensed format, allowing me to get up to speed faster. I can't accept that every tester would start from scratch; there must be ways to accelerate and take the fast lane. I need similar documentation when I work on an area I haven't had time for in a while - I forget, and need a quick reminder when coming back. I believe it's professional to leave things in a better shape than they were when you entered, and to care for the one who comes after you, even if no one explicitly tells you so.

This week, I was working on a feature that I realized is similar to something our team's other tester has worked on. So I needed the summary of the feature and the core lessons from that feature having already been tested. I dug into the documentation to find three mindmaps that don't make sense. Surely they list features, but why three? Why are they split this way? Why isn't there just one that I could work with, one with the up-to-date view? And when I looked at their details, there was nothing insightful written down - insightful to me. I also looked into the specifications, to notice that those do exist but were out of date. Even though I update specs regularly, I have not emphasized that it could be done by others too. And they were long and boring, with anything insightful that I as a tester could use buried somewhere between the lines. I know how to read to see the things that are not said, but I would love the two-page summary instead of the 44 pages.

With this behind me, I tweeted:
I'm sure I'm attacking mindmaps just because I'm disappointed in the quality of the ones I looked into, compared to my expectations at that time. It's hard, if not impossible, to guess future needs. But in this particular case, looking at the mindmaps again and again, I'm sure the trouble is that they describe structures from the time we knew the least (before testing), not the time we know the most (after we've tested). I would expect us to have the discipline to spend a moment at the end making sure that what we leave behind is - critically looking - the best we can do with the latest information and a reasonable amount of effort to balance value against cost.

So I looked around more, to realize that all our documentation, unless I have specifically commanded another format, is a mindmap. Even the low-tech dashboard I've advised creating is a mindmap. Which gives me the impression - unconfirmed - that there may be a liking of a tool in place of actively thinking about the format. Mindmaps are a tool that brings a format to the documentation you create with it. I'm not sure they would work well to capture stories. They seem to work well for listing features and subfeatures, and their relations. And when they have lots of detail, using them to keep track of what you covered just isn't what you could do with a checklist format.

I appreciated what James Bach pointed out:
I think it should not be a hidden point. In this particular case there never was the bloated traditional documentation. There never was a constraint that stopped us from creating the best imaginable documentation for the future - except our imagination. The imagination was lacking in thinking about audiences and times of use, focusing just on what I need and can create right now, while testing. I think a big part of the reason this happens is an over-reliance on mindmaps. Making the hidden point the main point might help with the choices.

For the information I was looking for, in all honesty, I could have used a mindmap. It would just have needed to be a mindmap that structures knowledge from the time after testing, when you know the most. It would have had different contents. But I could also have used a picture of the process embedded into the feature - no text would have been needed. And if I were to start testing that area, a checklist would be nice. Mindmaps are a poor format for checklists. A checklist, for me, is usually something with multiple occasions to tick it off that I want to keep track of: builds, environments and such. I would not have used step-by-step instructions on how to test, but the ideas of why and what to test, and how to make sense of it all quickly.

The fact that I can rework all the existing documentation again to learn the basics just isn't good enough. I want a better mix of things.

Wednesday, May 7, 2014

Volunteering - an active approach

On my walk to the office this morning, I realized that my tendency to volunteer has played a significant role in how my career has been built.

Most recently, I volunteered to improve a design - not by stating how it was incorrect or risky (testing) but by coming up with several alternative suggestions. I ended up owning the design of the feature and driving the discussions to conclusions and agreement.

On numerous occasions, I've volunteered to deal with something that needs doing but that everyone seems to be avoiding. I've volunteered to speak out, at personal risk, when I know something needs to be said that others would want to say but don't. Volunteering is my way of contributing. I'm very seldom assigned something to do; instead I'm actively looking at what needs doing and what I should volunteer for. I don't need to be a manager to be an active player in the direction I'm heading and we're working towards together as a team.

I realized volunteering comes naturally to me. I volunteer for the boards of non-profits I believe in. I volunteer to run a kids' computer club for 1st graders to teach them about versatile, collaborative creating with computers rather than just programming. I volunteer to organize meetups to meet great people who give me extra energy with their ideas and enthusiasm.

While at university, I volunteered for many student organizing activities. I held several positions of financial responsibility, organizing events with large audiences. So many of those lessons have been valuable, defining moments for what I do now.

Going all the way back to the first time I remember volunteering: my sister needed me. I volunteered to teach her Swedish so that she would pass a class without meeting a teacher she did not get along with. She aced the exam and was barely allowed onto the next class level. I felt useful.

There are so many things that would not have happened without volunteering. It's rewarding, and it drives me forward while allowing for all options. I suggest everyone should try volunteering. I've been surprised at how much of a difference it makes.

Thursday, May 1, 2014

Optimizing efforts into bug reporting

A colleague from another company contacted me asking about numbers on how many bugs testers find per time unit. I remembered checking that number for our organization last autumn for a presentation I gave, and now checked again to see if it had changed. It's not our goal or connected in any way with how we set goals, but I occasionally find it an interesting trigger for thinking through the results. The number had not changed at all - we log 5.8 issues per day. It was the same when I was the only tester, it was the same when the second tester started learning, and it's still the same for the two of us combined. Somehow it's more a number that reflects the point at which we prefer doing something other than logging issues.

We also had a paired testing session, the two of us. During the two focused hours, we covered rather shallowly a new area we had not had time for otherwise, and found about 20 issues. As the area was in the other tester's product (the current split of responsibilities is that I only manage on this product's team), I left all the issues for the other tester to log. Later in the afternoon she mentioned that she would need to leave the logging of issues for next week, as there are resolved issues to work on and two out-of-office days ahead for her. This triggered my favorite concern: the amount of time wasted on good and detailed bug reporting out of habit. After all, that's what testers are taught. I replied to say I would quickly log one issue with all the bugs we saw, just copy-pasting my notes into that one issue. The reasons for this unclear, summary-style reporting are many in this case:
  1. Copy-pasting notes, unclarities and all, takes me 30 seconds, while clear repro steps in step-by-step style easily take 5 minutes each. With 12 in the queue, that's an hour of work!
  2. Unclear reporting triggers the devs to talk to me. And they do, often. The positive impact of those discussions has been significant. 
  3. The one-liners alone tend to be enough to get the bug across - most of the time. Writing for a template instead of the audience wastes effort. 
  4. If I reported now, the devs could already fix them. And some of them most likely will be fixed before there is time in the calendar for detailed logging. 
  5. The rare skill in our team is seeing problems. There are plenty of developers to isolate the problems once they're given a hint that a problem exists.
There's a huge difference in reporting style between me and the other tester. If you've taken Cem Kaner's Bug Advocacy course or read articles on clear bug reporting, the other tester does all of that. But the other tester also must invest more time in logging. I write one-liners that describe just the relevant data, and let pictures talk for me. If my current reports were showcased to externals, they would not be clear enough. But it seems, asking the developers, that most of the time they are sufficient. And they can always trust me to demo things for them.

Imagine the amount of effort wasted on detailed reports if the detail is not necessary: 5.8 bugs every day, for 25 months with 20 working days each, comes to 242 hours even if 5 minutes per bug were enough.
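The back-of-the-envelope sum above can be checked with a few lines of Python. The bugs-per-day, months and working-days figures are the ones from this post; the 30-second one-liner time is my own rough estimate for comparison, not a measured number:

```python
# Rough estimate of reporting effort: detailed step-by-step
# reports versus copy-pasted one-liner reports.
bugs_per_day = 5.8      # observed logging rate
months = 25             # period covered
working_days = 20       # working days per month
detailed_minutes = 5    # minutes a detailed report takes
oneliner_seconds = 30   # rough time for a one-liner (assumption)

total_bugs = bugs_per_day * months * working_days
detailed_hours = total_bugs * detailed_minutes / 60
oneliner_hours = total_bugs * (oneliner_seconds / 60) / 60

print(f"{total_bugs:.0f} bugs logged")          # 2900 bugs logged
print(f"detailed: {detailed_hours:.0f} h")      # detailed: 242 h
print(f"one-liners: {oneliner_hours:.0f} h")    # one-liners: 24 h
```

Under these assumptions, the detailed style costs roughly ten times the reporting effort - which is the opportunity-cost question, not an argument that detail is never worth it.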

I'm a big fan of thinking about all activities - including the choice to log issues imperfectly, or to leave them unlogged entirely, using your judgement - in the frame of opportunity cost. With another choice of allocation, could you have done something more valuable? Not just the cost of testing, but the costs in development. Bug reporting the way most books describe it is not a best practice. My style isn't either. But my choices seem to fit this particular context - for now. And I can, as the test manager, ask the other tester to start experimenting with a new, quicker style of reporting that will probably take her out of her comfort zone.

Later addition: asking the developers about their preference, I learned two things. They find the step-by-step descriptions a waste of their effort - they need to wade through long material to get something they could get in a more concise form. And they have been annoyed - without ever saying a word - that all bug reports are logged in the evenings, as in kept in store for the whole day when there was a chance to do something about them. Know your audience. Ask, try different things. Pay attention to the things people don't say.

Yet another later addition: I seem to have offended the contracted tester's manager by claiming that "all bug reports are logged in evenings", when the data shows that a majority, but not all, of the bugs are logged after the developers leave the office at 15.00. To be clear, the previous addition reports verbally transferred information, and whether it is all or some, it is a feeling that was stated. I reported it not to offend, but to note that the pacing of how we choose to report, relative to when we find the issues, also has an impact the developers notice.

My choice of words may be incorrect, as in all vs. majority, but it is well-meaning, describing recent events that reflect things I have also learned over the years. I don't do safety language that well, but I don't intend to stop writing because of it.