I recently left a .NET gig where I was brought in to instill agile and craftsmanship practices across the entire project team. One of the more successful endeavors was the acceptance test-driven development (ATDD) effort. I was lucky enough to bring Joel Levandoski (joellevandoski) on board to head up this effort. Joel is an awesome developer, and on this project he served as the lead QA resource, bringing automated testing to the group. The QA group traditionally had not done automated testing, so we had a bit of learning to do. Joel did a fabulous job of learning a new tool (SpecFlow for .NET) and training other QA people on its use.
Our QA developers created acceptance tests from the acceptance criteria specified in the user stories. We spent a lot of time evolving our user stories, but by the end of the first six months we finally had a format that we liked and that facilitated communication between all the participants on the project. Our acceptance tests were written using SpecFlow, a Gherkin-compliant BDD testing framework for the .NET platform. SpecFlow is an amazing tool, and its Visual Studio integration is a great selling point: SpecFlow generates a stub unit test driver class for every feature file, targeting any of several unit testing frameworks, so the specifications run directly from the various GUI unit test runners. This is a nice convenience. We used the MSTest generation strategy baked into SpecFlow.
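To make that concrete, here is a minimal sketch of how the pieces fit together. The feature name, scenario, and step wording are hypothetical, not taken from our project; SpecFlow generates an MSTest driver class from the .feature file, and the [Binding] class below supplies the step implementations.

```gherkin
Feature: Account registration
  In order to use the application
  As a new visitor
  I want to register an account

Scenario: Register a new account
  Given I am an unregistered visitor
  When I register with valid account details
  Then I am welcomed as a new member
```

```csharp
using TechTalk.SpecFlow;

[Binding]
public class AccountRegistrationSteps
{
    // Each attribute regex matches a step in the .feature file.
    [Given(@"I am an unregistered visitor")]
    public void GivenIAmAnUnregisteredVisitor()
    {
        // Pending() marks the step as unimplemented, so the
        // generated MSTest reports inconclusive rather than passing.
        ScenarioContext.Current.Pending();
    }

    [When(@"I register with valid account details")]
    public void WhenIRegisterWithValidAccountDetails()
    {
        ScenarioContext.Current.Pending();
    }

    [Then(@"I am welcomed as a new member")]
    public void ThenIAmWelcomedAsANewMember()
    {
        ScenarioContext.Current.Pending();
    }
}
```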
Initially we used WatiN for testing our ASP.NET MVC app, but the project later grew into a suite of web and Silverlight applications, so we gravitated to WebAii from Telerik. This tool worked well for testing both the web and Silverlight environments.
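Before the migration, a step like the stubbed "When" above would have been fleshed out with WatiN along these lines. This is a sketch under assumptions: the URL, element names, and confirmation text are invented, and WatiN needs an STA test thread since it drives a real Internet Explorer instance.

```csharp
using System;
using TechTalk.SpecFlow;
using WatiN.Core;

[Binding]
public class RegistrationBrowserSteps
{
    [When(@"I register with valid account details")]
    public void WhenIRegisterWithValidAccountDetails()
    {
        // All of the browser-driving detail lives here in the fixture,
        // keeping the specification itself free of UI mechanics.
        using (var browser = new IE("http://localhost/register"))
        {
            browser.TextField(Find.ByName("Email")).TypeText("user@example.com");
            browser.TextField(Find.ByName("Password")).TypeText("secret");
            browser.Button(Find.ByValue("Register")).Click();

            if (!browser.ContainsText("Thank you for registering"))
                throw new InvalidOperationException("Confirmation text not found.");
        }
    }
}
```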
Our specifications were very focused on actions and the outcomes of those actions. We made a concerted effort to push the details of the steps into the fixture code and keep the specifications light and to the point, following the technique advice from Concordion.org. Doing this really made our specifications communicate the intent of "what" was being tested, not "how" the test worked.
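The difference shows up clearly when you compare spec styles. The registration scenario earlier is the declarative, "what"-focused form; a "how"-focused version of the same behavior (with hypothetical field names and text) would read like this, burying the intent in UI mechanics that belong in the fixture:

```gherkin
Scenario: Register a new account
  Given I am on the "/register" page
  When I type "user@example.com" into the "Email" field
  And I type "secret" into the "Password" field
  And I click the "Register" button
  Then I see the text "Thank you for registering"
```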
Many thanks to Joel Levandoski and Tim Anderson for driving the ATDD effort forward and making it a reality. There were numerous times that I would find Joel writing specifications during sprint planning while the rest of the group was reviewing user stories and associated acceptance criteria for the upcoming sprint. It is incredibly powerful to come out of sprint planning with a good portion of your acceptance tests already executable (all failing as inconclusive until the steps are implemented).