
Friday, June 24, 2011

Kingpin: How One Hacker Took Over the Billion-Dollar Cybercrime Underground by Kevin Poulsen

My rating: 5 of 5 stars


Awesome non-fiction. The author does a great job keeping the drama and action going. I couldn't put this book down after I started reading it. Interesting to watch Max Butler's (aka Max Vision) downward spiral into cybercrime and black-hat hacking. Very eye-opening to read the details of how these talented hackers can cloak their infiltration and siphon information from computers for weeks or months.



View all my reviews

Friday, June 17, 2011

Zero Day by Mark Russinovich

Zero Day by Mark Russinovich

My rating: 4 of 5 stars


Pretty good cyber-thriller. I think this is Mark's first novel. Knowledgeable author. The plot is very plausible and believable. A fair amount of technical information around malware, rootkits, and computer viruses. A fun, very quick read; I found myself not wanting to put it down, especially after I got to the second half of the book. The last third of the book is like a runaway freight train of suspense. Recommended!



View all my reviews

Sunday, March 27, 2011

Grails JAR dependencies with classifiers

Quick post on specifying Grails dependencies in BuildConfig.groovy. The recommended way to pull JAR dependencies into Grails is to use the dependencies DSL maintained in BuildConfig.groovy. I needed to bring down a dependency that has a classifier attribute on it. I didn't really find anything definitive on how to do it, but it seemed like following a convention might do the trick. Here's how I solved the issue:

repositories {
  grailsPlugins()
  grailsHome()
  grailsCentral()
  mavenCentral()
  ebr() // SpringSource Enterprise Bundle Repository
}
dependencies {
  // the classifier attribute selects the jdk15 build of json-lib (json-lib-2.4-jdk15.jar)
  runtime group: 'net.sf.json-lib', name: 'json-lib', version: '2.4', classifier: 'jdk15'
}

Saturday, March 05, 2011

Griftopia by Matt Taibbi

I recently read this book after seeing that Alan Cooper had read it and called it a terrifying book. I wondered what would be so terrifying about "Bubble Machines, Vampire Squids, and the Long Con That is Breaking America". After reading it, I wouldn't characterize it as terrifying so much as infuriating. The incompetence, greed, self-interest, and gluttony repeatedly portrayed in the book are extremely infuriating to me as a hardworking American citizen who pays taxes. The book chronicles some of the most audacious power grabs this nation has ever seen, and most of them have happened during the past two decades. Taibbi explains why the Tea Party is chasing its own tail, lambasts Alan Greenspan as "a one-in-a-billion asshole that has made America the mess it is today", and details the mortgage, commodities, and wealth-fund scams that we, American taxpayers, have had to endure over the last couple of years. The book is written in a no-holds-barred fashion with a fair amount of profanity thrown in to spice up the prose. It's an entertaining read, but also very thought-provoking, and it sheds some interesting light on the current political climate, especially around Obamacare and the health insurance industry. Very highly recommended.


Friday, March 04, 2011

Groovy Remote Control plugin via Maven

I had some issues getting the Groovy Remote Control plugin to pull down through Maven today. The documentation currently in place is not correct. Here are the fragments of my Maven POM that enabled me to pull the plugin in as a dependency:


 

 

Sunday, February 27, 2011

It's all about the conversations!

More pondering as I contemplate my previous consulting gig. This time, I'm considering requirements discovery. Pre-agile, people would write large requirements documents, hoping to document all the requirements the developers needed to build a system that would satisfy the customers. Unfortunately, this view of discovering all the requirements ahead of time is awfully naive. Things change. Requirements go undiscovered. Requirements that are captured are not thought out as well as we would like, and when it comes time to implement them in software, they don't make sense or are plain wrong. Thus, large efforts to capture all the requirements up front carry some amount of wasted effort. This approach concentrates most of the conversations at the beginning of the project. Conversations after the requirements document has been written and signed off are discouraged; they're viewed as evidence that there are errors in the requirements document. There doesn't seem to be any room for learning with requirements documents.

Now we have agile methods and the user story.  User stories are not requirements.  They're a planning tool for the agile team.  They are a statement of value that the business would like built.  There may or may not be acceptance criteria associated with the user story, depending on where the user story is in its lifecycle.  The most important part of the user story is the conversations that need to happen to flesh the story out so the feature value can be realized in the product.  This detail seems to escape a lot of people trying to use user stories to build products.  User stories give me the freedom to have conversations with all interested parties regarding the statement of value.  I learn just-in-time about the requirements as I implement the feature.  The business learns about how those requirements they communicated manifest themselves in a product.  And we both have the freedom to adjust, learning about the product along the way.

The above statements about user stories do not preclude one from using prepared, written documentation to feed the conversation. At my previous gig, we had a lot of federal government documentation regarding reporting and calculation requirements. These requirements are set in stone by the government. Yet there still need to be conversations about how those requirements will be accomplished across releases and sprints. This is where the conversations come back to the forefront.

User stories allow me to deliver value to the business or customer in bite-size portions.  Value is delivered in small increments, thus I can deliver these more frequently and solicit feedback from the business or customer.  When I have short feedback loops in place, I can nimbly change my course if I need to.

Agile database modeling

My previous project has had me doing a lot of introspection lately. One of the recurring themes I have been noodling on is evolving a data model using agile development techniques. The applications being built on this project are based on the .NET platform. The development group is currently using the Database Project template in Visual Studio 2010 but is looking to adopt a database migration process soon. The VS 2010 Database Project template does an admirable job of keeping track of all the DDL for your project, but it offers nothing for refactoring your database over time or migrating a production database. The template seems meant for rebuilding the database from the ground up, so there is no concept of database schema changes or migrations.

The database will evolve over time. Development groups should learn how to build their data model incrementally over many iterations and releases. Database migration tools can be very helpful in your quest to evolve your database over time. Tools like Liquibase and Rails migrations are very good at supporting this sort of development behavior, and there are tools on the .NET platform that do this sort of thing as well.
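To make this concrete, here is a minimal sketch of what an incremental, reversible change might look like in a Liquibase-style Groovy changelog (the format used by the Grails database-migration plugin; Liquibase itself also accepts XML or YAML changelogs). The table, column, and changeset names here are hypothetical and only illustrate the pattern:

databaseChangeLog = {
  changeSet(author: 'example', id: 'add-customer-email-column') {
    // a small, story-driven change applied on top of the existing schema
    addColumn(tableName: 'customer') {
      column(name: 'email', type: 'varchar(255)')
    }
    // an explicit rollback keeps the change reversible
    rollback {
      dropColumn(tableName: 'customer', columnName: 'email')
    }
  }
}

Each changeset is applied once and recorded by the tool, so a production database can be migrated forward release after release instead of being rebuilt from scratch.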

Another thing that has caused quite a bit of headache is the desire of the data group (data architects and DBAs) to get out ahead of the developers and build out much more of the data model than the developers need for the current sprint. We have found that when your data modeling efforts are not driven by user stories, they tend not to align with the work of completing those stories. Developers and data people then end up conversing about data model changes that could have been avoided by waiting for the right time to make them. My advice for building an operational data store that one or more applications will be developed on top of: don't try to develop the entire data model up front. You will inevitably end up changing the data model as requirements change (hopefully you're using stories to guide your development). Typically your application developers will have constraints and needs that must be accommodated in the data model. As the data person, you need to be communicating with them and working from user stories. Ideally, the data people are part of the project team and are dedicated to the agile process. We really did not have that on this project, and I think it caused issues and slowed us down a bit.

Refactoring Databases should be required reading for development groups.

 

Great experience with Acceptance Test Driven Development (ATDD) and SpecFlow

I recently left a .NET gig where I was brought in to bring agile and craftsmanship behaviors to the entire project team. One of the more successful endeavors was the acceptance test-driven development (ATDD) effort. I was lucky enough to bring Joel Levandoski on board to head up this effort. Joel is an awesome developer, and on this project he served as the lead QA resource, bringing automated testing to the group. The QA group traditionally had not done automated testing, so we had a bit of learning to do. Joel did a fabulous job of learning a new tool (SpecFlow for .NET) and training other QA people on its use.

Our QA developers created acceptance tests from the acceptance criteria specified in the user stories. We spent a lot of time evolving our user stories, but by the end of the first six months I think we finally had a format that we liked and that facilitated communication between all the participants on the project. Our acceptance tests were written using SpecFlow, a Gherkin-compliant BDD testing framework for the .NET platform. SpecFlow is an amazing tool, and its integration with Visual Studio is pretty nice. That IDE integration is a great selling point: SpecFlow generates a stub unit test driver class for every feature file, translating the specifications into normal xUnit-style tests, and it supports several unit testing frameworks, so the specifications can be run directly from the various GUI unit test runners. This is a nice convenience. We used the MSTest generation strategy baked into SpecFlow.

Initially we used WatiN for testing our ASP.NET MVC app, but the project later grew into a suite of web and Silverlight applications, so we gravitated to WebAii from Telerik. This tool worked well for testing both the web and Silverlight environments.

Our specifications were very focused on actions and the outcomes of those actions. We made a concerted effort to push the details of the steps into the fixture code, keeping the specifications light and to the point. We followed the technique advice from Concordion.org. Doing this really made our specifications communicate "what" was being tested, not "how" the test worked.
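As a purely hypothetical illustration of that style, a SpecFlow feature might read like the following; the feature, scenario, and step wording are made up, and the "how" (navigation, element lookups, waits) would live in the step definitions and fixture code rather than in the specification:

Feature: Expense report approval
  Scenario: Manager approves a submitted expense report
    Given an employee has submitted an expense report
    When the manager approves the report
    Then the report status is "Approved"
    And the employee receives an approval notification

Every step states an action or an observable outcome; none of them mention pages, buttons, or selectors.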

Many thanks to Joel Levandoski and Tim Anderson for driving the ATDD efforts forward and making this effort a reality. There were numerous times I would find Joel writing specifications during sprint planning while the rest of the group was reviewing user stories and the associated acceptance criteria for the upcoming sprint. It is incredibly powerful to come out of sprint planning with a good portion of your acceptance tests already executable (all failing as inconclusive).

Wednesday, January 12, 2011

Tweaking your user story mapping efforts

I had a great day leading a business group through a user story mapping session at my current client. Story mapping is a technique that Jeff Patton has popularized for giving your product backlog some structure. I've done a couple of these user story mapping sessions with pretty good success. Today we changed up a couple of things with the session and saw some good results, so I thought it might be worth a blog post.

First, identify your high-level activities and lay them out across a wall.  We used large Post-It sheets and attached one high-level activity to each sheet.  By doing this, we could move activities and associated tasks around the room, allowing us to rearrange priorities easily.

Next, give the customer/business group five (5) minutes to come up with as many tasks as they can think of for each high-level activity.  Time-boxing the effort keeps you on a regular cadence.  Post the task Post-Its on the large Post-It sheets in no particular order.  Move from one activity to the next, spending the same amount of time on each.  Don't worry about duplicate tasks or the prioritization of the tasks.  You'll come back to these, culling and prioritizing the tasks associated with each activity.

After harvesting the tasks for each activity, go back to each activity and cull out the duplicate tasks and prioritize the tasks according to Jeff Patton's story mapping technique.  We spent 20 minutes on each activity and were able to get a backbone of tasks defined, with other non-core tasks associated with the activity.

Another technique for ensuring that high priority tasks percolate to the walking skeleton row of the story map is to give the business people sticky dots to place on the tasks that they think are core.  We had our business folks put their initials on the sticky dots so we knew who voted up the task.  The dots stand out on the story map and the business really liked using this prioritization technique.  Having the initials on the dots gives you added information regarding who is connected to what stories.

Now that the tasks are prioritized, you can walk your story map and talk through it with your customers/business people. Walking the story map ensures that the ordering of the activities and tasks makes sense and that nothing has been missed. By tweaking our story mapping session today, we were able to keep everyone in the business group engaged and the conversations flowing.

Tuesday, January 11, 2011

PeepCode has new Rails 3 videos up

If you're interested in getting up and running with Rails 3, I recommend PeepCode's videos.

https://peepcode.com/pages/rails-3-screencasts

Testing as a learning sandbox

I've been spending some quality time with NHibernate 3.0. Last night I got stuck on an issue with a Criteria query where a collection passed to a constructor was null and the framework was complaining. After writing a couple of integration tests that exercised various parts of my domain object model, I was able to determine that the collection types I was using for the many side of my relationships (ISet<T> and HashSet<T> in this case) were the wrong collection types for my collection semantics configuration. I'm continually amazed at how powerful testing, both unit and integration, can be. My tests today allowed me to create a sandbox to try things and work out a misunderstanding I had with NHibernate. Pretty cool.

Friday, December 31, 2010

Anatomy of a successful large agile project

I recently had a conversation with a colleague of mine at a company where I’m currently consulting. We’ve been trying to bootstrap a collection of projects using an agile development process and the associated software craftsmanship behaviors, and we have had mixed success to date. Frustrated with the progress, my colleague asked me to enumerate what I felt were the success factors on the WestlawNext project that I had recently participated in.


I worked at Thomson Reuters from January 2008 to August 2010, during the initial releases of the WestlawNext project.  I was with the project from the beginning of its software development; the product development group had been working on the inception of the WestlawNext project for a couple of years prior.  WestlawNext was a very large project.  Hundreds of people were involved and millions of dollars were spent to build the next generation legal and regulatory research tool.  A lot was riding on this product.  It had to be a success—there was no option for failure.   The following themes are what I feel made this project a success.

 

Attitude

Now that I’ve had time to ponder my WestlawNext experience from afar, I think the number one reason for its success was attitude. This was an audacious effort: build a new legal research tool in two years' time, with that many people involved, using an agile software development process.

But from the very beginning, a “can-do” attitude was instilled in the group: we would succeed. We were going to “knock the ball out of the park” with this product. There was never a thought that this thing might fail. Many concerted efforts were made to continually propagate this attitude throughout the project's participants. Project tee-shirts, baseball trading cards, raffles, and summer socials were utilized to promote this team spirit. This infectious attitude allowed us to overcome obstacles that would probably derail other projects. People were willing to take responsibility for their work and put in effort over and above the call of duty, time and time again.


Communication

Communication is one of the most important functions of a software development project. Large projects are very susceptible to communication breakdowns as the number of people increases. We tried to minimize these breakdowns by favoring face-to-face communication as much as possible. We were encouraged as software developers to pair program. Designers were encouraged to work directly with developers on styling concerns. We were encouraged to collaborate when tough problems arose. No GoToMeeting. No conference calls. Face-to-face conversations.

We were extremely lucky to be able to co-locate almost everyone on the project in three areas of the Thomson Reuters facility in Eagan. And when I say everyone, I mean business people, vice presidents, directors, testers, designers, managers, coaches, and software developers. This is one of only a few places I have consulted at that have had the luxury of co-locating people in common areas.

 

Leadership

WestlawNext benefited from strong leadership that was 100% dedicated to the project. No other obligations—they were focused solely on the development of the new product. Our leaders were also quite familiar with the agile software development process. Some of them had come from other agile projects, both within the company and from outside, so they didn’t have to start from square one, and many already knew the key behaviors of the process. A few of them were software developers at one time (or still are). That is refreshing from a developer’s point of view: they understand what it takes to build software; they’ve been in the trenches.


There is one moment in particular that I am very fond of. I was working on a tough networking issue in some .NET code that we had provided to another group. We were throwing spurious socket-closed exceptions, but they didn't happen all the time and seemed to occur only when the server load increased. Our senior director, one of our leaders, was helping triage the issue and participating in our root cause analysis. He had technical chops and was quite proficient at network analysis. He rolled up his sleeves, got right in, and loved being able to help solve the issue. We did solve it; it turned out to be tied to a deprecated thread-local storage API in .NET. That leader earned a ton of my respect that day.

 

Testing

Testing is paramount to building a quality software product.  The WestlawNext project embraced testing like I have never seen before in my career.  We evolved our designs with unit testing.  We used integration testing to ensure that software components were wired together correctly.  Acceptance testing ensured that features did not regress in future iterations of development.  Load and performance testing was continuously run in an effort to tune the overall product.  Beta testers were allowed to play with the software well in advance of its initial release date, ensuring that it satisfied the customer.


All of this testing allowed us to build tight feedback loops, giving us near-instantaneous data on the health of our growing and evolving product.  The suites of tests infused confidence within the project group; we knew exactly how the software performed at specific load levels.  I cannot fathom working on a software development project that does not fully embrace the aforementioned levels of testing.

 

Conclusion

In conclusion, I’m starting to realize that the WestlawNext project may have been one of those rare moments where everything came together in near perfect harmony to produce a great product.  As I have moved on from Thomson Reuters, I yearn to replicate a similar experience at my other clients.  My current engagement only reinforces the fact that every software development project takes a different path to success, and some may never make it to the end.

 

 

 


Sunday, October 31, 2010

Kaleidoscope diff tool for Mac OS X

Found a really interesting new diff and merge tool for the Mac: Kaleidoscope.  Native app, integrates with Versions, Cornerstone, and the command line.  Looks promising.

 

AutoFixture: A generic Test Data Builder implementation for .NET

Just came across a Test Data Builder implementation for .NET: AutoFixture. The Test Data Builder pattern has become quite popular recently since it was mentioned in Growing Object-Oriented Software, Guided by Tests. I've used the pattern before, but I've always built the builder implementations by hand. This implementation looks really promising.

Wednesday, October 13, 2010

Great video on what motivates us

Great video on drive and motivation. Love the whiteboard drawings. Spend 10 minutes watching this video.



Tuesday, August 31, 2010

Loving my new Magic Trackpad

Just received my Magic Trackpad from Apple today. Very impressed after a bit of use. Love all the different gestures you can map actions to. After using this, a mouse is going to seem awfully archaic.

Monday, May 17, 2010

Brief history of mock objects

A brief history of mock objects, a story told by Steve Freeman. Good read as to the "why" of mock objects and how Hamcrest, jMock, and others came about.

http://www.mockobjects.com/2009/09/brief-history-of-mock-objects.html

Monday, May 10, 2010

Steve Freeman on sustainable TDD

Excellent presentation by Steve Freeman on sustainable TDD.  Lots of great tips for making your unit tests easier to comprehend and maintain.

http://www.infoq.com/presentations/Sustainable-Test-Driven-Development

 

 

Sunday, May 09, 2010

Practical styles of pair programming

Excellent blog on pair programming.

http://blog.xebia.com/2010/05/09/practical-styles-of-pair-programming/

A quote from the blog entry:

"No you're not faster on your own, you're just creating more crap for your colleagues to puzzle over and eventually delete. The code you write alone sucks. That guy that is getting on your nerves is trying to tell you (clumsily) that your code sucks, try to listen to him and you'll turn into a better programmer." 

Have you encountered one or more of these styles?  How many developers are pair programming these days?