
Outstanding technology leader, software architect, and culture engineer with a proven track record of pushing start-ups out of idea mode and into profitability.  Extensive background cultivating healthy cultures and processes remotely, as well as writing code and building a team to support it quickly and efficiently.  Excels in designing and implementing software systems and development processes in an environment of rapid change and growth.  Startup enthusiast, agilist, adviser, cyclist, and runner.  Jeremy is a DZone MVB, is not an employee of DZone, and has posted 1 post at DZone.  You can read more at his website.

QA is Dead. Long live QA!


 This isn't specific to startups but it still applies.  I was recently asked for advice on how to go from two week sprints to one.  The conversation was one I've had several times.

Client: "We are a scrum shop that has two week sprints.  We'd like to release faster.  Any suggestions?"
Me: "Do you have a QA handoff during the sprint?"
Client: "Sure.  We basically do waterfall during the sprint."
Me: "I've got it!"
Client: "Great!"
Me: "Fire your testers."
Client: "..."

I'm only half joking.  

I used to think having a QA person was the essential fourth technical hire, adding more as needed as the organization grew.  For close to ten years that's how I'd managed teams, ensuring each team had access to at least one.  That changed last year.  We were pushing for faster and faster releases with a client and something didn't feel right.  As it happens, we were having trouble keeping our QA role filled, and the problem was wreaking havoc with our release schedule.  It needed to stop.

We held a series of meetings to discuss our needs and what could be done to address them.  We all agreed we needed tests.  We all agreed we needed someone to own testing.  We also agreed that devs should own unit tests, but whether, and how much, integration testing they should own was a matter of intense debate.  Should we have a QA person who only does spot testing?  Should a human ever repeat a test by hand?  Should we forgo human testing and instead have a QA Engineer who was chiefly a programmer?  If so, how could we cleanly divide their work from the other engineers'?
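
As a concrete illustration of the distinction we were debating, here is a sketch of what developer-owned tests at both layers can look like.  Every name here is hypothetical (the article discusses no actual code): a unit test exercises pure logic in isolation, while an integration-style test wires a few pieces together.

```python
# Hypothetical cart module; none of these names come from the article.

def apply_discount(total, percent):
    """Pure business logic -- the natural target of a unit test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(total * (100 - percent) / 100, 2)

class InMemoryOrders:
    """Stand-in for a real order store, so the test needs no database."""
    def __init__(self):
        self.saved = []

    def save(self, order):
        self.saved.append(order)

def checkout(orders, total, percent):
    """Glues several pieces together -- what an integration test covers."""
    final = apply_discount(total, percent)
    orders.save({"total": final})
    return final

# Unit test: no I/O, no collaborators, runs in milliseconds.
def test_apply_discount():
    assert apply_discount(200.0, 25) == 150.0

# Integration-style test: checks that the pieces cooperate end to end.
def test_checkout_persists_discounted_total():
    orders = InMemoryOrders()
    assert checkout(orders, 200.0, 25) == 150.0
    assert orders.saved == [{"total": 150.0}]
```

Run them with any test runner (e.g. pytest); the point is that both layers live next to the code and belong to the developer who wrote it.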

It was a lot to process.  During this time we plowed through countless testing resources: books, blogs, and tweets.  We hit the jackpot when we ran into How Google Tests Software, a great book on how testing evolved during the early days at the Goog, and it gave us the answers we were looking for.  The sky opened.  We had been looking at QA all wrong.

I'm paraphrasing, but the problem is essentially in thinking that any part of QA is somebody else's job.  We weren't so far gone as to think that engineers didn't own any of it, but we certainly weren't owning enough.  Engineers write a few unit tests and figure that's it.  Managers jam a QA person between the engineers and each release and call their job done.  The reality is that if you want to avoid waterfalls entirely you've got to bake your testing completely into your code effort - not some of the tests, but all of them.  The code isn't done until the testing is.

We were skeptical at first.  I mean, when you're used to seeing a net below you when you cross the high wire it's a little unnerving when it's gone, right?  Once we realized that having devs own the whole process meant the wire was actually a bridge, there was no fear.  The need for a safety net was an illusion perpetuated by our own bad behavior.

How do you know when you've done it right?  You won't need any testers.  Having a tester from the get go creates an artificial dependence on someone else to do your testing for you. It also creates an unnecessary step in your release process. Be your own tester first. Separate QA roles should only exist once your QA needs involve a strategic planning component that can no longer be distributed throughout the development team.  It depends somewhat on your dev team and your product, but for most places this isn't until the third or fourth year.

Do the work yourself.  Design a workflow that requires developers to wipe their own behinds by writing automated tests for, and testing, their own code.  Your devs make smarter decisions.  You can stop paying for people you don’t need.  You can finally get the waterfall out of your scrum.  I would go so far as to suggest that Continuous Delivery can't be achieved without this approach.  You can do without dedicated QA.  Start now.  Your code, your process, your developers, your timeline, and your budget will thank you for it.
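
One way to make that workflow mechanical rather than aspirational is a release gate that treats a red test suite as "not done."  This is a sketch with invented names, not a prescription from the article:

```python
# Sketch of a release gate: the code isn't done until its tests pass.
import subprocess

def release_gate(test_command):
    """Run the team's whole test suite -- unit and integration alike --
    behind one command; return True only when every test passes."""
    result = subprocess.run(test_command, capture_output=True, text=True)
    return result.returncode == 0

# Illustrative wiring: call this from a pre-push hook or CI job with
# something like release_gate(["python", "-m", "pytest", "-q"]) and
# refuse to ship whenever it returns False.
```

Wired into a pre-push hook or CI job, a gate like this removes the human QA handoff from the release path without removing the testing.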
Published at DZone with permission of Jeremy Stanton, author and DZone MVB. (source)

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)


Senthil Balakrishnan replied on Thu, 2014/01/02 - 1:45pm

Very interesting article.  When a team adopts an agile methodology they quickly come to the realization that dev is not just coding; QA and dev are not two different entities anymore.

But many organizations that claim to follow agile don't do the level of automation that's needed for effective agile development.  I guess that's the agile adoption maturity level :)

Jeremy Stanton replied on Thu, 2014/01/02 - 3:24pm in response to: Senthil Balakrishnan

I would agree.  These things can't happen in silos.  It's easy to adopt the trappings of Agile but if you want to squeeze all the juice out, so to speak, changes like this aren't just nice to have, they are necessary.

Serguei Meerkat replied on Sat, 2014/01/18 - 10:02am

You won't need any testers.  Having a tester from the get go creates an artificial dependence on someone else to do your testing for you.

I don't think this is correct. The point of testing by QA is having a second pair of eyes. Someone has to test it for you if you don't want the client to find bugs you as a developer missed.

It is also important that someone who doesn't see the system from a source-code point of view checks whether it works for them too.

Jeremy Stanton replied on Sun, 2014/01/26 - 11:13pm in response to: Serguei Meerkat

I agree with you.  In our process when we turn our 'homework' in at the end of the sprint we demo what we've checked in.  We use this to catch anything last minute before it lands in the client's lap.  

This demo is collaborative with the product owner and if there are subtle bugs they are caught here.  The bugs are then queued up to be fixed in the next sprint.  If they are non-trivial bugs then the sprint is considered broken and shame on the dev.  This is because it is incumbent upon the developer to make sure they built something without obvious bugs.  This may involve collaborating with the product owner.  

If between the developer and the product owner it is not possible to find and fix the obvious bugs (during a given sprint) you have bigger issues to deal with that live in your process and product ownership, rather than QA.

Jeremy Stanton replied on Sun, 2014/01/26 - 11:24pm in response to: Jeremy Stanton

Above when I say "This is because it is incumbent upon the developer to make sure they built something without obvious bugs.  This may involve collaborating with the product owner."  I meant collaborating during the body of the sprint.  It is assumed by our process that they will test the code at the end of the week during the demo.  

This works best if you have sprints no longer than a week.  If you have longer sprints, or think you have to kill every single last bug before each release, I would urge you to reconsider: how fast can you run?
