Are you a gamer? You know, the guy who secretly sneaks away after dinner and turns on your Xbox or PS3. The alter ego sets in and off you go to kill Nazis, Zombies or turn a lap faster than Mario Andretti (I imagine many young techies have no idea who Mario Andretti is). Maybe you are the greatest mock-rock-star since Slash and can wail out solos on your little plastic guitar in Guitar Hero or Rock Band. Or maybe you're one of those Facebook virtual farmers growing virtual crops.
I used to be a gamer, but to be perfectly honest, I don't think I've been excited about gaming for about 20 years now. My son loves to game, and he tries to get me to do it. I visited him last year up at Berkeley, and he got me to play Guitar Hero. He was amazing... I sucked. At one point, I turned to him and said "Look, you're so good at this you ought to learn how to play the guitar for real!" He wasn't interested in guitar playing, he just wanted to win the game.
The interesting thing about people is that regardless of how good or bad they are at gaming, once you understand the objective of a game and how it is played, you can always get a little better. It doesn't matter what the game is. As long as it provides a score, you can improve it with a little effort. And certainly, such is the case with IT workers. They are master gamers!
We recently switched one of our projects to the Agile methodology. Agile has its roots in Lean and Extreme Programming principles. It's a great methodology for getting people to creatively collaborate and get work done more effectively. I decided to take our organization into Agile for two purposes:
- Better Quality
- Greater Throughput
I'll talk more about Agile in future blogs, but for now, let's just accept it as a really cool methodology that will delight programmers once they get the hang of it and get comfortable with the exposure it brings.
One of the fundamental practices in Agile is to estimate the effort of the tasks the team is about to undertake in a group setting. Mike Cohn has a great YouTube video that will teach you how to estimate by playing "Planning Poker." The purpose of planning poker is to estimate tasks as a relative measure against other tasks. The theory is that it's pretty darn close to impossible to guess the absolute measure of something, but it is far easier to estimate things when using relative measures.
For example, ask your team "How tall is the Empire State Building?" The answers will be all over the place, from a few hundred feet to maybe even a mile high. Now, show your team a picture of the New York skyline featuring the Empire State Building, and ask them "How many of those buildings would you have to stack up to be as tall as the Empire State Building?" You will get very accurate and consistent answers.
That's a fundamental concept of Agile. People understand the relative difficulty of tasks with greater accuracy than the specific effort required to solve any one particular task. In an Agile environment, you don't really care too much at first how accurate the estimation is... you let that develop. You measure it with each sprint (a time-boxed iteration your team will work). The team will complete as many tasks as possible from your backlog of items. And, once the sprint is complete, you measure their velocity (throughput).
So, there we were at the team's first Agile Planning Poker event (okay, it was a meeting). Everyone was given a deck of cards. Each deck contained the following cards: 1, 2, 3, 5, 8, 13, 20, 40, and 100. Items from a prioritized backlog were brought up one at a time. The team discussed each item for just a few minutes. Then everyone was asked to go to their deck and pull out a card that represented the difficulty of the task. Note, you have to pick a card or stay completely out of the voting (people who had no idea how to solve a particular problem were encouraged not to vote). No numbers can be selected that are in-between card values, like 7. It's either a 5 or an 8! Pick either the 5 or the 8. Remember, this is all relative anyway.
Ten cards were pulled. I saw a 5, a bunch of 8's, and two 13's. In Agile, you ask the outliers why they voted their number. Their input is considered, then you revote. Eventually, the team came to agreement on the score of each task. It was recorded and the next item was scored.
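If you like to think in code, the rules of that voting round can be sketched in a few lines of Python. This is just an illustration of the mechanics described above, not the tool we actually used:

```python
# Planning Poker round sketch (illustrative only).
CARDS = [1, 2, 3, 5, 8, 13, 20, 40, 100]

def validate_vote(vote):
    """A vote must be one of the card values -- no in-between numbers like 7."""
    if vote not in CARDS:
        raise ValueError(f"{vote} is not a card; pick one of {CARDS}")
    return vote

def outliers(votes):
    """The low and high voters explain their reasoning before a revote."""
    return min(votes), max(votes)

def consensus(votes):
    """The round ends when everyone shows the same card."""
    return len(set(votes)) == 1

# The first round from the story: a 5, a bunch of 8's, and two 13's.
round1 = [validate_vote(v) for v in [5, 8, 8, 8, 8, 8, 8, 8, 13, 13]]
low, high = outliers(round1)   # the 5 and the 13's explain their votes
round2 = [8] * 10              # after discussion, the team revotes
```

After the revote, `consensus(round2)` is true and the score is recorded.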
By the end of the meeting the team completed scoring the relative difficulties of each of the items. We knew the priority, and our team was ready to start their first sprint. The team was asked to select as many items as they believed they could deliver in a 4-week timeframe. The goal was to have working, deployable code by the end of the 4 weeks. The team picked up 40 points.
Well, we discovered that in our first sprint (the 4-week period), the team completed 45! What an accomplishment! Everyone loved the process, they felt energized about how fresh it was and became very eager to take on the next sprint. Another planning meeting and the team picked 45 points. Guess what? They delivered an amazing 59 points that month! This Agile methodology is AMAZING!!!
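For the curious, here's how that sprint history looks as data. The "commit to last sprint's delivered points" heuristic is a common Agile convention that happens to match what our team did; it's my framing, not a rule from our process:

```python
# Sprint history from the post: committed vs. delivered story points.
sprints = [
    {"committed": 40, "delivered": 45},  # sprint 1
    {"committed": 45, "delivered": 59},  # sprint 2
]

def velocity(sprint):
    """Velocity is simply the points actually delivered in the sprint."""
    return sprint["delivered"]

def next_commitment(history):
    """A common heuristic: commit to the last sprint's velocity."""
    return velocity(history[-1])

print(next_commitment(sprints))  # prints 59
```

Notice the pattern: each sprint's commitment equals the previous sprint's delivery, and delivery keeps climbing. That climbing number is exactly what had us celebrating.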
As you can imagine, I was absolutely thrilled with the results. In fact, one team reached a velocity of 99 points one month. They were so proud. So was I.... right up until we learned about the regression problems we were introducing in the code.
Even though the teams were super-enthusiastic about the new way we were doing things, they got a little sloppy in their ability to catch regression problems. Let's just say that on that part of the game, well, they kind of bit the dust.
We were measuring points for each sprint... an indication of busy-ness and effort. However, we weren't measuring the additional problems we were introducing, because, quite frankly, no one knew we were introducing problems at first.
Igor Mandrosov, my good Russian friend of 15 years now, who also happens to be our Agile coach, said to me "Your problem isn't velocity, your problem is Negative Inflow."
"What's that?" I said.
"When you run high velocity your regression error rate goes through the roof." He actually told me the ratio of velocity to negative inflow points... not a good ratio, but we are improving.
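Igor hasn't published a formula, so take this as my own back-of-the-envelope sketch of how you might track Negative Inflow alongside velocity. The regression numbers below are made up for illustration; they are not our team's real data:

```python
# A hypothetical way to put "Negative Inflow" (Igor's term) next to velocity.

def inflow_ratio(velocity_points, regression_points):
    """Regression points introduced per point of delivered velocity."""
    return regression_points / velocity_points

def net_velocity(velocity_points, regression_points):
    """Delivered work minus the rework the sprint created."""
    return velocity_points - regression_points

# Illustrative numbers only: suppose a 59-point sprint introduced
# regressions that later cost 12 points to fix.
ratio = inflow_ratio(59, 12)   # roughly 0.2 points of rework per point shipped
net = net_velocity(59, 12)     # 47 points of genuinely new work
```

The point of measuring it this way is the same point Igor made to me: a sprint that ships 59 points but creates 12 points of rework isn't really a 59-point sprint.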
"Okay, let's start measuring it and share it with the developers. Things will get better," I said.
Gamers!
What are some of the ways that you address quality issues in your company?
Gus
P.S. Negative Inflow is a term Igor Mandrosov coined. To date, I haven't seen any tools in the Agile marketplace, outside of the ones we've developed under Igor's direction, that address regression errors. As far as I'm concerned, Igor is advancing Agile, and he should trademark "Negative In-Flow."