Sometimes, you write something that you have to keep a lid on for a little while, because you don't want to offend someone who might be reading. This is one of those cases.
A coworker and I found ourselves at a client site that was a big believer in my favorite of dysfunctional methodologies, Waterfall. We were hired to help them design an advanced new system, with innovative technology, that (of course) interacted with their still very active Mainframe development. We showed up with great enthusiasm (because we didn't know yet what was going on).
Here is what we were tasked to create. First, we needed a Functional Design document (a technical-ish set of requirements) to pass off to the business unit to get approval for the project (which was already approved -- that's why we were there). It took man-weeks of effort to produce this utterly useless document, which was created by emailing Word documents around, because they don't put design documents in version control. (What? Did I hear that right? Why not? "We just don't." "No, why not?" "If it's in version control, others can see it.") OK.
Next came the Technical Design Document, another multi-man-week project, in which we took the Functional Design and produced a Technical Design -- meaning that we finally got to reveal all the technical decisions we had made while writing the Functional Design but couldn't put in that document. And we couldn't produce "prototypes," because that's a bad word: business people want to see prototypes run. "Can we call them spikes?" "Sure, no one knows what that means." A few times, I thought my coworker's head was actually going to explode. We set up our own secret version control system and started writing our design documents (oh, sorry, Design Documents) in XML so that we could diff them, then converted them to Word documents when the big flurry of Consolidating the Documents happened.
Then we made a decision. Given the amount of time we had to create the Technical Design Document, we could just write the code, using XP techniques (test-first coding, iterative design, etc.), and produce the design document from the working, tested code. And that's what we did with part of the system. We kept this under wraps (after all, we were supposed to be designing, not coding) and made good progress. To show the quality of the code, I added code coverage reporting with Cobertura so that we could demonstrate that the unit tests were actually exercising the code. With just a couple of days left in the Technical Design phase, we were asked to provide estimates for how long it would take to implement our design. That's when we let the cat out of the bag: "Do you want us to tell you how long it would take, or how long we have left? We can give you really good estimates, because it's already done." We showed them the implementation and the unit tests, along with the coverage report (96% code coverage, 100% branch coverage), and a Technical Design document that was already based on working code.
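In case "test-first coding" sounds abstract, here is a minimal sketch of the style in the JUnit 3 idiom of the time. The InterestCalculator class and its behavior are hypothetical stand-ins for illustration, not the client's actual code; the point is only the shape of the workflow -- the test exists before the production class, and running tests like these against Cobertura-instrumented classes is what produces line and branch coverage numbers.

    import junit.framework.TestCase;

    // Hypothetical example of test-first style: this test is written
    // before the production class it exercises even exists.
    public class InterestCalculatorTest extends TestCase {

        public void testSimpleInterestForOneYear() {
            InterestCalculator calc = new InterestCalculator(0.05); // 5% annual rate
            // 1000.00 principal for one year at 5% should yield 50.00 in interest
            assertEquals(50.00, calc.interestFor(1000.00, 1), 0.001);
        }

        public void testZeroPrincipalYieldsZeroInterest() {
            InterestCalculator calc = new InterestCalculator(0.05);
            assertEquals(0.00, calc.interestFor(0.00, 10), 0.001);
        }
    }

    // The production code, written only after (and driven by) the tests above.
    class InterestCalculator {
        private final double annualRate;

        InterestCalculator(double annualRate) {
            this.annualRate = annualRate;
        }

        double interestFor(double principal, int years) {
            return principal * annualRate * years;
        }
    }

Nothing fancy -- but a pile of tests like these, plus a coverage report showing which lines and branches they exercise, makes the "the design is already implemented" conversation a very short one.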
They weren't upset, but they weren't particularly happy either -- just kind of stunned. This way of writing software was so foreign to them that they had no perspective on it. They had never seen code with unit tests, or code coverage, or any code produced in an iterative way. You're supposed to spend months on design documents, then write the code, right? How can you just skip all that? Instead of the elation we expected, we got a weak "Good job, guys," and they took their working code and incorporated it where it was supposed to go. We were finished with our tasks, so we left.
But not without planting some seeds. You see, we involved some of their other programmers in what we were doing. We didn't code with them, but we explained how we were going about it, and we showed them unit tests (which they had heard of) and code coverage (which was new to them). The other developers clearly saw the benefit of what we had done. Sometimes you can't dig a deep hole all at once, because you have to move too much dirt. But a river can dig a deep hole a little at a time, eroding away grit and stone bit by bit. Even if we didn't dig a deep hole, maybe we created a rivulet.
Thursday, September 15, 2005
Technology Snake Oil, Part 5: The Integration Myth
Integration is a tricky thing. When it works flawlessly, it is a huge productivity boon. But when you trust it and it lets you down, it is a huge time sink. Some examples: I used to work with someone who had a genuine fetish for tool integration. He wasted more time trying to get tools to work together than he did performing useful work. At the time, we were using JBuilder and Visual SourceSafe (for my opinions on this piece of technology, check out VSS: Unsafe at Any Speed). The integration between these two tools was spotty at best. I never even tried – it was easy enough to keep the VSS explorer open alongside JBuilder and just bounce between them. But not Mr. Integration. He tried a variety of supposed integration plug-ins, each with serious shortcomings. Finally, exasperated, I told him that I had the perfect integration strategy. “I’ve fixed your JBuilder/VSS integration. First, open both of them. Second, I’ve installed a special hot-key in your copy of JBuilder, Alt-Tab. When you hit it, the VSS explorer pops up, and it works just like the explorer that VSS itself uses – it’s indistinguishable!”
Recently, another example popped up from an unexpected place. It turns out that there is a bug in the integration between my beloved IntelliJ and WebLogic, but it only appears when you are doing distributed transactions and messaging. My colleague spent the better part of a day tracking down that nasty little bug.
The snake oil hiding in the shadows here is the promise of integration sold by tool vendors. IDEs try to encompass more and more. Visual Studio has always been successful at this (especially if you use all Microsoft technologies) without achieving a truly great coding experience. Visual Studio .NET 2003 is a third-world country compared to the scarily intuitive IntelliJ, which is the Rolls-Royce of IDEs. VS.NET 2005 is better, but still not up to IntelliJ’s standard. Borland tried to take this to its extreme conclusion with its Software Delivery Optimization suite, which tried to bind requirements gathering, version control, coding, deployment, and monitoring together into a single environment. That vision yields an awesome productivity gain as long as the integration is flawless. However, integration at that level is never flawless, which means you spend as much time as you save (and incur considerably more frustration) trying to figure out why something that should be working isn’t, only to find that it actually is working and the integration is obscuring the results.
Some integration is good – that’s why we have integrated development environments. However, there is a fine line past which vendors go too far and end up lessening productivity rather than enhancing it. The problem is that tool vendors are trying to create monolithic environments; I suspect that Visual Studio in all its incarnations may be the culprit here, since everyone is trying to replicate that environment. Finding the line between integrated and stand-alone tools is tough, and it is going to get tougher as vendors produce ever more immersive environments to get you to buy into their integration strategy.
Someone has done this exactly right. In the next Technology Snake Oil (Drowning in Tools), I’ll talk about that.