
Wednesday, June 17, 2009

Testing, Estimates and Burndown...

Adam Feldman posted the following question to the Agile Alliance forum on LinkedIn:
In your experience, what do you all do in estimating for testing? Do you include the time expected to test the story in the points allocated for the actual story, or is this normally not included?

The second part of my question regards burn down charts. If we are not allocating points to the actual testing of the story, is the burn down chart really only telling us what is built - not what is actually complete?
The first three responses answered that testing should be part of the team effort and not an afterthought. To help Adam find a way to explain this to the team and management, I focused on the concept of DONE criteria:
The fundamental question here is "what is your DONE criteria?".

If the team's DONE criteria is built (but not tested), then the burndown and estimates do not include testing. This will lead to an optimized development team, but potentially create problems for testing and ultimate delivery. Things will get built faster, but ultimately may be delivered slower (which is not upper management's goal).

If the team's DONE criteria is delivered business value to the user/customer (the true agile definition), then the estimate and burndown includes testing effort. Testers should be part of the team, tests need to be run within the sprint, stories should be completed within the sprint including testing.

This isn't easy, but the second option is the mature one.

There are two pieces to overcoming this. You (the testing group) need to learn to work more quickly and closely with the people building. They (the developers) need to learn that DONE is defined by business value, not code complete. If "you" and "they" both overcome this, then everyone becomes a "we" or an "us". Testers become involved in sprint planning and review as well.

I've blogged about this several times if you are curious:
Downstream testing
Tests after story closure
What to do with found bugs
How do testers fit in agile
I'm curious where the discussion thread will go over the next few days on LinkedIn. Anyone have additional thoughts?

Monday, June 8, 2009

Automated Acceptance Tests...

There's an interesting article on InfoQ about automating acceptance tests. It hints that this practice hasn't enjoyed the following that some of the other XP practices have. The article appropriately ends with this conclusion:

Now, consider if the tests written by the QA department are written before the development begins. The information provided by these scenarios now occurs at the beginning of the iteration in a predictable fashion. Therefore the uncertainties are reduced, velocity becomes more stable (fewer random interruptions), and that means more predictability.

So, are automated acceptance tests just something the elite (or lucky) have been able to make work? Is there some unseen internal flaw that has caused its less-than-stellar adoption? Or is it just a difficult practice with proven benefits that every software development team should aspire to adopt?

Having attempted to do automated acceptance testing with my last team in a healthcare setting using FitNesse, I left the following comment:
... The value of automation is the repeated run.

Automated acceptance tests can have a high value for heavy data exchange (as opposed to screen manipulation). For example, test a signup or registration form for all of the required fields, field-level logic, etc. With regression testing, this ensures data integrity. They are also a good fit for testing REST interfaces or other publicly accessible APIs.

Don't use them for drag and drop UI testing or color/stylesheet testing.
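To make the data-exchange case concrete, here's a minimal sketch of that kind of required-field acceptance test. The form fields and the `signup` function are invented stand-ins for illustration; a real suite would drive the running application (through FitNesse, an HTTP client, etc.) rather than a local function.

```python
# Hypothetical stand-in for the signup handler under test; a real suite
# would POST to the running application instead of calling this directly.
REQUIRED_FIELDS = {"email", "password", "zip_code"}

def signup(form: dict) -> dict:
    """Return {'ok': True} or {'ok': False, 'missing': [...]} for a submission."""
    missing = sorted(REQUIRED_FIELDS - form.keys())
    if missing:
        return {"ok": False, "missing": missing}
    return {"ok": True}

def test_each_required_field_is_enforced():
    complete = {"email": "a@b.com", "password": "secret", "zip_code": "48104"}
    assert signup(complete) == {"ok": True}
    # Drop each required field in turn and confirm the form rejects it.
    for field in REQUIRED_FIELDS:
        partial = {k: v for k, v in complete.items() if k != field}
        result = signup(partial)
        assert result["ok"] is False and field in result["missing"]

test_each_required_field_is_enforced()
print("required-field checks passed")
```

The repeated run is where the payoff is: once written, this loop re-verifies every field rule on every regression pass for free.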
Do any of you have experiences in this area?

Thursday, March 26, 2009

TAD vs. TDD vs. paired TDD

This is a repost from Jeff Langr where he simply models how TDD is quicker than TAD and paired TDD is quicker than either of the first two.

Simple but effective.
TAD:
CCCC xxxx TTTTT xxx IIIII RRR II

TDD:
TCTCTCTC xxxx IIII xxx RR I

Paired TDD:
TTCCTTCCTTCC xxxx xxx
Confused? Read the full version here.
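For those who'd rather see the model in code: here's one "TC" beat of the TDD loop sketched in Python, with a made-up price-rounding function as the example.

```python
# One turn of the TDD crank (a "TC" pair in the model above),
# using a hypothetical price-rounding function.

# Step 1 (T): write the test first -- at this point it fails,
# because round_price doesn't exist yet.
def test_rounds_to_cents():
    assert round_price(19.999) == 20.00
    assert round_price(5.004) == 5.00

# Step 2 (C): write just enough code to make the test pass.
def round_price(amount: float) -> float:
    return round(amount, 2)

# Then rerun the test (green) and move to the next "TC" pair,
# refactoring whenever the tests give you cover to do so.
test_rounds_to_cents()
print("green")
```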

Tuesday, March 10, 2009

Dale Emery has spoken...

For whatever reason, Dale Emery blasted out a whole pile of blog posts this morning (update: he just modified his blog structure which caught up the feed!). I started to follow his work after attending his "Resistance as a Resource" session at last year's Agile Conference in Toronto.

Two of his posts stood out because they discussed how to approach a technical system and think of it as a planned response system. He goes on to define system responsibilities, essences, events, and obligations.
The definition of a system’s essence makes no mention whatever of technology inside the system, because the system’s essential responsibilities would be the same whether it were implemented using software, magical fairies, a horde of trained monkeys, or my brothers Glenn and Gregg wielding pencils and stacks of index cards.
In a second post in the series, he goes on to discuss the anatomy of responsibility further. Both are good reads, and I encourage you to check them out.

Staying on the Dale Emery kick for the day... I also enjoyed his post about "testing as an information service." He focuses on the quality assurance and testing team roles and how they affect the surrounding team.
Testing is an information service. The point of testing is to inform stakeholders about the system. This is not a new sentiment, nor does it originate with me.
He goes on to tell a story about one student in his class and how the student realized the goal of his role is not to rub developers' faces in found bugs, but to inform the team and stakeholders equally of what is broken AND what works as expected. Testing is a litmus test, not a contest.

hmmm... I'm going to repeat that last one again and stake claim to the quote... "Testing is a litmus test, not a contest."

Friday, March 6, 2009

Friday highlights...

In case you didn't see them:
  1. Achieving Agility Needed for Business Survival (Info Q) - 8 values & ethics, along with a good comparison between agile and the restaurant business.
  2. Accounting for bugs the agile way - part II (ASD) - simple summary by Jack Milunsky of how to handle bugs in your process... yes, they are on the sprint backlog... no, they don't add to velocity.
  3. The Ideal Workspace (Cohn) - what is needed for a team (of developers) to fulfill their basic workspace needs?

Tuesday, March 3, 2009

TDD is good...

Most of you probably follow InfoQ and have seen the recent article referring to the study showing that TDD improves quality. The data suggests that developing with TDD is 15-30% slower, but defects decrease on the order of 40-90%. Most managers would agree that the defect savings more than pay for the increased cost. The study used closely comparable control groups, so the data should be considered valid. A peer of mine mentioned that the TDD in the study was not as mature as it could be, implying the gap might be even bigger for teams mature in TDD.
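As a back-of-envelope illustration of why the trade tends to pay off, here's the arithmetic using the midpoints of the article's ranges. The baseline numbers (100 hours of development, 20 escaped defects, 4 hours per fix) are invented for the sketch.

```python
# Assumed baseline numbers -- purely illustrative.
baseline_dev_hours = 100
defects = 20
hours_per_defect_fix = 4

tdd_dev_hours = baseline_dev_hours * 1.25   # midpoint of "15-30% slower"
tdd_defects = defects * (1 - 0.65)          # midpoint of "40-90% fewer defects"

baseline_total = baseline_dev_hours + defects * hours_per_defect_fix
tdd_total = tdd_dev_hours + tdd_defects * hours_per_defect_fix

print(f"without TDD: {baseline_total:.0f} hours")  # 180 hours
print(f"with TDD:    {tdd_total:.0f} hours")       # 153 hours
```

Even with TDD's slower build phase, the saved defect-fixing time comes out ahead under these assumptions, and the gap widens as fix costs rise.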

Also, Mike Bria made an attempt at an elevator pitch to promote the value of TDD. His pitch is a little longer than my typical elevator ride, but it is a good explanation for those techies looking for a better understanding.

Tuesday, November 25, 2008

Non-Functional Requirements...

If you haven't heard, there's been some buzz about non-functional requirements in both the community (InfoQ) and from the experts (Mike Cohn). I'm also seeing some activity in the Agile Alliance LinkedIn group.

Wednesday, November 12, 2008

Google's Agile Tester...

Ever wonder what it is like to be a Test Engineer at Google?

A job with that title might sound like being chained to the keyboard trying to break the system, but instead you find a story about adding real value and making life easier for the development team.

If you work in a larger organization going through an agile transition and everyone is trying to figure out the new role of the quality assurance group... read this post to get insight into a good example of how it might work after the transition.

Monday, November 10, 2008

First-hand TDD experience...

TDD (Test Driven Development) is one of those things that I endorse, but can't teach since I no longer code (crap, there went my reputation in the community). I understand the value of it, I've seen it done successfully, and I can recognize when a group turns that corner and it just clicks for them.

It's great when you run across a first-hand experience to help explain TDD to newbies, especially when it contains the reality that it takes some time to learn. Brian Genisio put up a good post about his first year using TDD and his turning of that corner.

Wednesday, October 15, 2008

How do testers fit into agile?

I'm a strong believer they must be part of the team.

If you have a testing organization within your company, then before agile was implemented you were probably used to pitching "completed" work over the wall and crossing your fingers. Then there was a period of quiet before the wave of bugs came flowing back over the wall.

Going agile is about drastically decreasing the distance between these two points. Actually, TDD (test driven development) is about removing that distance completely. With this philosophy, it is important for a team going through an agile transition to deal with this issue. Instead of the test team relationship being "us and them", you have to find a way to fold the test resources directly into your team. Resources should be slightly more dedicated and must participate in planning and sprint review. Even if they don't aid in prioritization, planning, and design, their presence in the room ensures that testing is built into the DONE criteria for sprint work and that the timelines are realistic. Their insight can catch issues early, especially those surrounding performance or misuse of the system. They get a view of what is coming down the pipe so that they can get ahead of the curve and stop being behind the eight ball.

My best experiences with testers on a scrum team occurred when they attended the daily scrum. Also, if they sat with the team throughout the day, they could mentor the team on their unit and acceptance tests to ensure the basic logic and validation were covered. They spent more of their time catching the "really hard to find" bugs. They could modify performance and load testing scripts before the work was complete so that system tests could be run almost immediately instead of always being a sprint behind.
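The mentoring above often amounts to small, concrete suggestions. Here's a sketch of the kind of boundary check a tester sitting with the team might prompt a developer to add; the age rule and its limits are made up for illustration.

```python
# Hypothetical validation rule a tester might push the team to cover fully.
MIN_AGE, MAX_AGE = 18, 120

def is_valid_age(age: int) -> bool:
    return MIN_AGE <= age <= MAX_AGE

def test_age_boundaries():
    # Test each side of both boundaries, not just one "happy path" value --
    # off-by-one mistakes at the edges are exactly what testers catch.
    assert not is_valid_age(17)
    assert is_valid_age(18)
    assert is_valid_age(120)
    assert not is_valid_age(121)

test_age_boundaries()
print("boundary checks passed")
```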

The sooner you welcome non-developers onto your agile team, the sooner you see these types of benefits. The same points hold true for usability, user interface, or business analysts.

If you want to know more about testing in agile, I just read a great post by rfleming on whether testing is about uncovering defects or changes. His points are insightful and triggered me to write this, but I believe the issue he discusses can be reduced if you strive toward the points I raise here.

Monday, October 13, 2008

Darth Vader was an agile coach too...

I swear I'll stop with the Star Wars references, but I'm not the one making these up and it is complete coincidence that they are back to back...

Courtesy of Sebastian Bergmann on Flickr

Monday, September 22, 2008

Improve your testing with 5 for 5...

Someone sent me this link on the Search Software Quality site about the differences between agile and waterfall as they relate to quality and testing. It's not written in a style that normally lands in my reading list, but there are some interesting tidbits inside.

One great little gem deep inside the article is about building respect on the team around peers and testing. This link led to a great blog post about the "five bugs in five minutes" game that is a fabulous idea.

In case you don't want to follow the last link, here's the summary:
  • I (developer) think I'm done
  • I challenge a peer to review my work
  • If they find 5 bugs in 5 minutes... I buy them lunch

Why?
  • TDD and auto-testing is good, but it's not creative like the human brain
  • 5 minutes is quick
  • I learn from my peers (like pair programming)
  • It's cocky (I challenge the best because I believe there aren't any) and therefore fun
  • It encourages your team to have lunch together and build stronger relationships.
Check it out... I'm going to see if I can convince my team to try it.

Friday, September 19, 2008

JUnit starter guide...

For anyone trying to figure out TDD, especially unit testing, JB posted his JUnit starter guide for free today.

Monday, September 15, 2008

Iteration/Sprint != Waterfall...

This discussion about code reviews has triggered a bunch of thoughts in my head. For now, I'll stick to the loudest one.

Sprint boundaries are supposed to provide transparency to the customer and the business, not be a scramble for shippability. The end of the sprint shouldn't create a flurry of check-in activity leading to a pile of new tests to find previously unknown bugs (which can't be fixed in time for review).

Instead, the daily meeting/scrum should provide a heartbeat for the team and every day or so, stories should be completed. Adopting agile should convert you from yearly releases to quarterly releases, to monthly releases, and maybe even weekly releases.

Have you ever flipped a rock in the woods and watched all the critters scramble and disappear? This is not what the day before sprint review is supposed to be like!

Instead of thinking of the sprint review as a point in time to have stuff working for demo and approval, you should be thinking of it as a time to have the customer, product owner, and team assess progress and determine whether the upcoming plan, priorities, and communicated delivery dates are still realistic.

Here is another great list you can use to spot an agile team.

Monday, September 8, 2008

Downstream testing...

No, no, no.... and no.

The words "downstream" and "test" should not appear near each other in a sentence.

Kevin Rutherford had an interesting post about how better testers allowed the developers to become lazy. As humorous as this was (and I can definitely see it happening), I started to think about what he was saying. Especially as he asked: "What did you do to harness the skills of your great testers so that they constructively support your great coders?"

I got to thinking about the flaw and whether or not it was obvious to him. Even if it was obvious to him, maybe it wasn't to readers of his post. So I had to comment.

Tests need to be run before the story is accepted, reviewed, and closed. If you work in an environment where a testing team runs manual scripts to do testing, then this is included. If this means that your story isn't closed for a while, then so be it. If this is painful to you, then work as a team to shorten that time frame and remove the waterfall.

Paul Richie had some similar thoughts (thanks for the reference Paul).

Tuesday, September 2, 2008

Tests after story closure?

As a customer I tend to hang out in the VersionOne google-group user forums. Sometimes I find myself providing agile coaching feedback unrelated to the tool. Today was one of those days.

A peer was asking questions about using the tool when running tests. But their team regularly closed stories in one sprint and had the testing team test that work in the next sprint. I had to share the following feedback:

"Flags are raised in my head whenever someone says that their testing team works on the prior iteration's work.

This is problematic for the following reasons:
- the development team has moved on to new work
- if the testing team finds something, they must "interrupt" the development team to go back and fix the bug
- this throws off velocity (or it is gamed and false)
...and this whole concept feels like a mini-waterfall since it ignores that the product should be "potentially" shippable at the end of an iteration.

What we did on my last team was have a CI (continuous integration) build that ran every night, separate from the check-in test/build process. If everything passed, then the test environment was automatically updated in the middle of the night. Thus, our test team was part of the sprint team and they tested stories as they were built. Stories were not closed unless Dev and Test agreed AND the customer accepted them. This removes the pitch-over and splash-back feeling.

It can also remove a lot of interruptions and debates that pit QA against Dev.

Having said all of that, you may not technically be able to do this today. Or, your culture may not be ready to truly be agile. As a pragmatist, I say whatever you are doing is better than nothing. I just wanted to throw this out there as a potential goal to strive for."

I'm glad I said the last part... my peer responded with appreciation but noted that they were working through the technical hurdles to creating an automated build.
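For anyone working through those same hurdles, the shape of that nightly job is simple. Here's a rough sketch; the `echo` commands are stand-ins for whatever your real build, test, and deploy commands are (all invented for illustration).

```python
# Sketch of a nightly CI driver: build, run the full suite, and only
# promote the build to the test environment when everything is green.
import subprocess
import sys

def run(cmd: list[str]) -> bool:
    """Run a command, returning True on a zero exit code."""
    return subprocess.run(cmd).returncode == 0

def nightly() -> int:
    if not run(["echo", "build the project"]):        # stand-in for the real build
        return 1
    if not run(["echo", "run the full test suite"]):  # stand-in for the tests
        return 1
    run(["echo", "deploy to the test environment"])   # promote only if green
    return 0

if __name__ == "__main__":
    sys.exit(nightly())
```

Scheduled overnight (cron, a CI server, etc.), this is what lets testers start each day against a fresh, passing build instead of waiting a sprint.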

Monday, June 30, 2008

What to do with found bugs...

As a customer I tend to hang out in the VersionOne google-group user forums. Sometimes I find myself providing agile coaching feedback unrelated to the tool. Today was one of those days.

The discussion was surrounding the tool and what to do when a tester finds a bug.

Option 1: reopen the old story and tasks.
Option 2: enter a new story or task to fix the bug.

I pushed the discussion towards the process issues and impact.

My summary point -
If the customer/product owner can live with the bug and it is not critical to release, then option 2 might be acceptable to save time and focus on prioritized business value.
But if delivery can't be reached without the bug being fixed, and especially if the bug was injected while working on that story... THEN THE STORY ISN'T DONE.

It's painful to re-open a story and go back to something the team believes is done and wants credit for. But this is our job. The quality bar is delivered, working software.

If you buy a burger and it has a fly in it... you expect it to be fixed or replaced. You don't go back to the counter and get charged for a new one. Just like the restaurant doesn't get to charge again, neither do you. Your team's velocity is no different than money in a restaurant.