# Tuesday, July 11, 2006

One of the things that has irked me about using SVN with VisualStudio.NET is setting up a new project.  You’ve got some new thing that you just cooked up, and now you need to get it into Subversion so it doesn’t get lost.  Unfortunately, that means you have to “import” it into SVN, being careful not to include any of the unversionable VS.NET junk files, then check it out again, probably someplace else, since Tortoise doesn’t seem to dig checking out over the existing location.  Big pain.

Along comes Ankh to the rescue.  I’ve been using it off and on for a while (version 0.6, built from source), but now I’m hooked.  It adds the traditional “Add to source control” menu item in VS.NET, and it totally did the right thing: imported, checked out to the same location (in place), and skipped all the junk files.  Worked like a charm.  I’m definitely a believer now.

Tuesday, July 11, 2006 10:46:49 AM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [3]  | 

I’m a big fan of watching TV shows after they come out on DVD.  You don’t have to deal with the commercials, and you can be assured of not missing anything.  Plus, I don’t have cable, so it’s about the only way I ever see TV.  Anyway, Vikki and I just finished season 1 of Veronica Mars.  What a fantastic show.  I can see why Joss Whedon calls it the best show that no one is watching.  Great dialogue, good acting (mostly), a great story arc, and I totally didn’t see the ending coming.

While each episode explores a subplot about the rigors of high school, etc., the overarching story line is a murder mystery, and the season ends with the murderer revealed (it’s not who you think).  They pulled off some very interesting plot twists throughout.  I’m breathlessly anticipating season 2 next month.  There are still a number of open questions which I’m hoping they’ll pursue in the second season.

If you like the Whedonverse (BtVS/Angel/Firefly) you’ll probably like Veronica Mars.  Best dialogue this side of Joss himself.

Tuesday, July 11, 2006 10:42:52 AM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, July 05, 2006

So at TechEd, Scott captured Jeff and me talking in the friendliest of fashions about the relative merits of Team System (which Jeff evangelizes) and the “OSS solution” I helped craft at Corillian, involving Subversion, CruiseControl.NET, NUnit, NAnt, et al.

Since then, I did a bit for PADNUG about the wonders of source control, and it caused me to refine my positions a bit. 

I think that in buying a system like Team System / TFS (or Rational’s suite, etc.) you are really paying not just for process, but for process that can be enforced.  We have developed a number of processes around our “OSS” solution, including integrating Subversion with Rational ClearQuest so that we can relate change sets in SVN to issues tracked in ClearQuest, and a similar integration with our Scrum project management tool, VersionOne.  However, those are policies based upon convention, and we thus can’t enforce them.  For example, by convention we create task branches in SVN named for ClearQuest issues (e.g. cq110011 for a task branch to resolve ClearQuest issue #110011), and we use a similar convention to identify backlog items or tasks in VersionOne.  The rub is that the system depends upon developers doing the right thing.  And sometimes they don’t.

With an integrated product suite like TFS, you not only get processes, but you get the means to enforce them.  In a previous job, we used ClearQuest and ClearCase together, and no developer could check in a change set without documenting it in ClearQuest.  Period.  It was not left up to the developer to do the right thing, because the tools made sure that they did.  Annoying?  Sure.  Effective?  Absolutely.  Everyone resented the processes until the first time we really needed the information about a change set, which we already had waiting for us in ClearQuest. 

Is that level of integration necessary?  We’ve decided (at least for now) that it’s not, and that we are willing to rely on devs doing the right thing.  You may decide that you do want that level of assurance that your corporate development processes are being followed.  All it takes is money. 

What that means (to me at least) is that the big win in choosing an integrated tool is the integration part.  Is the source control tool that comes with TFS a good one?  I haven’t used it personally, but I’m sure that it is.  Is it worth the price if all you’re looking for is a source control system?  Not in my opinion.  You can get equally capable SCC packages for far less (out of pocket) cost.  It’s worth spending the money if you are going to take advantage of the integration across the suite, since it allows you to not only set, but enforce policy. 

I’m sure that if you choose to purchase just the SCC part of TFS, or just Rational’s ClearCase, you’ll end up with a great source control tool.  But you could get an equally great source control tool for a lot less money if that’s the only part you are interested in.

The other thing to keep in mind is that the integrated suites tend to come with some administration burden.  Again, I can’t speak from experience about TFS, but in my prior experience with Rational, it took a full-time administrator to keep the tools running properly and to make sure they were configured correctly.  When the company faced some layoffs and we lost our Rational administrator, we switched overnight to using CVS instead, because we couldn’t afford to eat the overhead of maintaining the ClearQuest/ClearCase tools, and none of us had been through the training in any case.  I’ve heard reports that TFS is much easier to administer, but plan for the fact that the burden is still non-zero.

So, in summary: if you already have a process that works for you, you probably don’t need to invest in a big integrated tool suite.  If you don’t have a process in place (or at least enough process in place), or you find that you are having a hard time getting developers to comply, then it may well be worth the money and the administrative overhead.

Wednesday, July 05, 2006 3:54:38 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [2]  | 
# Tuesday, June 27, 2006

I took Steve’s comment to heart, and got rid of the two places I had been “forced” to use the CodeSnippetExpression.  It took a few minutes’ thought, and a minor refactoring, but I’ll sleep that much better at night.

Vive le DOM!

Tuesday, June 27, 2006 1:10:37 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, June 26, 2006

A few days back, Jeff commented that he wasn’t convinced about the value of putting an object model on top of what should be simple string generation.  His example was using XmlTextWriter instead of just building an XML snippet using string.Format. 

I got similar feedback internally last week when I walked through some of my recent CodeDOM code.  I was asked why I would write

CodeExpression append = new CodeMethodInvokeExpression(sb,"Append", new CodePrimitiveExpression(delim1));

instead of

CodeExpression append = new CodeSnippetExpression("sb.Append(delim1)");

It’s a perfectly reasonable question.  I’m using the CodeDOM’s object model, but the reality is that since we’re an all-C# shop, I’m never going to output my CodeDOM-generated code as VB.NET or J#.  So I could just as easily use a StringBuilder and a bunch of string.Format calls to write C# code, then compile it using the CodeDOM’s C# compiler.  It certainly would be simpler.
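For concreteness, here’s a minimal sketch of that string-based approach: build up the C# source as text (with a StringBuilder, string.Format, or both), then hand it to the CodeDOM’s C# compiler.  The Greeter class and Greet method here are invented for illustration.

using System;
using System.CodeDom.Compiler;
using System.Text;
using Microsoft.CSharp;

class StringBasedCodegen
{
    static void Main()
    {
        // Write the C# source as plain text...
        StringBuilder source = new StringBuilder();
        source.AppendLine("public class Greeter");
        source.AppendLine("{");
        source.AppendLine("    public static string Greet(string name)");
        source.AppendLine("    { return \"Hello, \" + name; }");
        source.AppendLine("}");

        // ...then compile it in memory with the CodeDOM's C# compiler.
        CSharpCodeProvider provider = new CSharpCodeProvider();
        CompilerParameters options = new CompilerParameters();
        options.GenerateInMemory = true;
        CompilerResults results =
            provider.CompileAssemblyFromSource(options, source.ToString());
        if (results.Errors.HasErrors)
            throw new InvalidOperationException("codegen failed");

        // Invoke the freshly compiled code via reflection.
        Type greeter = results.CompiledAssembly.GetType("Greeter");
        Console.WriteLine(greeter.GetMethod("Greet")
            .Invoke(null, new object[] { "world" }));  // "Hello, world"
    }
}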

The same is true (as Jeff points out) for XML.  It’s much easier to write XML using string.Format, or a StringBuilder. 

If nothing else, I personally find that using the object-based interface makes me think harder about the structure I’m creating.  It’s not really much less error-prone than writing the code (or XML) by hand; it just provides a different way to screw up.  What it does do is force you to think at a higher level of abstraction, about the structure of the thing you are creating rather than the implementation.  You may never need to output binary XML instead of text, but using XmlTextWriter brings you face to face with the nuances of the structure of the document you’re creating.  Writing a CDATA section isn’t the same as writing an element node.  And it shouldn’t be.  Using the object interface makes those distinctions more obvious to the coder.
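To illustrate the XML side of that point, here’s a quick example of my own (the element names are made up): with XmlTextWriter, element content and CDATA go through different calls, a distinction that string.Format would happily paper over.

using System;
using System.IO;
using System.Xml;

class XmlWriterDemo
{
    static void Main()
    {
        StringWriter buffer = new StringWriter();
        XmlTextWriter writer = new XmlTextWriter(buffer);
        writer.Formatting = Formatting.Indented;

        writer.WriteStartElement("message");
        // Element content: the writer escapes the '<' for us.
        writer.WriteElementString("subject", "1 < 2");
        // CDATA: a structurally different node, and a different call.
        writer.WriteStartElement("body");
        writer.WriteCData("raw <markup> passes through untouched");
        writer.WriteEndElement();  // body
        writer.WriteEndElement();  // message
        writer.Close();

        Console.WriteLine(buffer.ToString());
    }
}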

However, it’s definitely a tradeoff.  You have to put up with a lot more complexity, and more limitations.  There are a bunch of constructs that would be much easier to write in straight C# than to express in the CodeDOM.  It’s easy to write a lock{} construct in C#, but much more complex to create the necessary try/finally and monitor calls using the CodeDOM.
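As a rough sketch of what that expansion involves (the helper and variable names here are mine, not from any real codebase), a lock(sync) { … } block comes out as Monitor.Enter followed by a try/finally that calls Monitor.Exit:

using System.CodeDom;
using System.Threading;

static class LockExpansion
{
    // Builds the CodeDOM equivalent of: lock(sync) { body }
    static CodeStatement[] BuildLock(CodeExpression sync, CodeStatement[] body)
    {
        // Monitor.Enter(sync);
        CodeExpressionStatement enter = new CodeExpressionStatement(
            new CodeMethodInvokeExpression(
                new CodeTypeReferenceExpression(typeof(Monitor)), "Enter", sync));

        // try { body } finally { Monitor.Exit(sync); }
        CodeTryCatchFinallyStatement tryFinally = new CodeTryCatchFinallyStatement();
        tryFinally.TryStatements.AddRange(body);
        tryFinally.FinallyStatements.Add(
            new CodeMethodInvokeExpression(
                new CodeTypeReferenceExpression(typeof(Monitor)), "Exit", sync));

        return new CodeStatement[] { enter, tryFinally };
    }
}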

I was, in fact, forced to resort to the CodeSnippetExpression in one place, where nothing but a ternary operator would do.  I still feel guilty.  :-)  Maybe it just comes down to personal preference, but I’d rather deal with the structure than the syntax, even if it means I have to write more complicated code.
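(For what it’s worth, that guilty snippet probably looked something like the line below; the CodeDOM has no node for the conditional operator, so raw C# text is the only way in.  The variable names are invented.)

CodeExpression delim = new CodeSnippetExpression("(first) ? \"\" : \", \"");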

Monday, June 26, 2006 2:05:19 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [1]  | 

About 4 months ago I moved into a brand new (town)house.  It’s been great, particularly since our last house was generating more maintenance opportunities than we could handle.  The new place is 3 stories, and there’s a deck off the back of the second floor, over the driveway.  Staining/finishing said deck is left as an exercise for the homeowner, and yesterday I finally got around to it.  At first I didn’t want to tackle it due to the ever-present rain, and lately it’s just been a matter of finding the time.  And I really hate ladders.

Anyway, I had the time, the materials, and no rain.  Unfortunately, it was around 100° yesterday.  It’s a small deck, but nonetheless 4 hours of huffing paint fumes on my hands and knees left me a bit knackered.  And today I’m finding out how unbendy I’ve become (i.e., crippled).

This whole getting older thing really blows. 

Monday, June 26, 2006 1:14:20 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [2]  | 
# Thursday, June 22, 2006

I’m a huge fan of runtime code generation.  I think it solves a lot of common problems in modern applications.  However, the .NET implementation of runtime code gen, a.k.a. the CodeDOM, is viciously difficult to learn and use.  I think that’s holding many people back from implementing good solutions using runtime codegen.

For example, it takes all this

CodeConditionStatement ifArrayNotNull = new CodeConditionStatement(
    new CodeBinaryOperatorExpression(propertyRef,
        CodeBinaryOperatorType.IdentityInequality,
        new CodePrimitiveExpression(null)));

CodeMethodInvokeExpression convertExpr = new CodeMethodInvokeExpression(
    new CodeTypeReferenceExpression(typeof(Convert)), "ToBase64String",
    new CodeExpression[] { new CodeCastExpression(typeof(byte[]), propertyRef) });

ifArrayNotNull.TrueStatements.Add(convertExpr);

to make this

if (myObject.Property != null)
{
    Convert.ToBase64String(myObject.Property);
}

Not only is it a lot more code using the CodeDOM, but it’s certainly not the kind of code that you can just pick up and understand. 

There must be an easier way.  Refly helps a bunch, but last time I tried it I found it to be incomplete.  Still, it’s a step in the right direction, and its model is much easier to understand.  I wonder if, in the end, it’s too limiting?  There may be a good reason for the complexity of the CodeDOM.  Or there may not be.
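To make the “easier way” concrete, here’s a toy helper in the spirit of such wrapper libraries (emphatically not Refly’s actual API, just a sketch of mine): common shapes get one-liners, and the verbose example above collapses accordingly.

using System;
using System.CodeDom;

static class Cd
{
    // if (expr != null) { then }
    public static CodeConditionStatement IfNotNull(CodeExpression expr,
                                                   params CodeStatement[] then)
    {
        CodeConditionStatement result = new CodeConditionStatement(
            new CodeBinaryOperatorExpression(expr,
                CodeBinaryOperatorType.IdentityInequality,
                new CodePrimitiveExpression(null)));
        result.TrueStatements.AddRange(then);
        return result;
    }

    // TypeName.Method(args);
    public static CodeExpressionStatement Call(Type type, string method,
                                               params CodeExpression[] args)
    {
        return new CodeExpressionStatement(new CodeMethodInvokeExpression(
            new CodeTypeReferenceExpression(type), method, args));
    }
}

// With helpers like these, the example above becomes:
//   Cd.IfNotNull(propertyRef,
//       Cd.Call(typeof(Convert), "ToBase64String",
//               new CodeCastExpression(typeof(byte[]), propertyRef)));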

Thursday, June 22, 2006 4:32:32 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [2]  | 

Next Wednesday evening, I’ll be talking about the wonders of source code control at this month’s PADNUG meeting.  If you’re currently using SCC, you might get some tips for making better use of it.  If you aren’t, you’ll find out why you should be, even if you work alone.  I’ll be focusing on the role of SCC in agile development and continuous integration.  I’ll also talk about some of the many version control systems available, and which one(s) may be right for your environment (as well as which ones you really shouldn’t be using: you know who you are…).

And there’s pizza!

Thursday, June 22, 2006 3:33:23 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
I hope not.  But since it seems to otherwise be a good recording, I fear it’s so.  Check out Hanselminutes 21, wherein Scott interviews Jeff Atwood and me about the relative merits of Subversion vs. Team System.  Taking the show on the road was a great idea, and all the segments came out very well.  I’m amazed that he and Carl were able to clean up the audio to the point that you can hardly hear the other 7,000–8,000 people in the room with us.
Thursday, June 22, 2006 3:28:15 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [1]  | 
# Monday, June 19, 2006

On Saturday, Vikki and I participated in this year’s NW Emergency Response Team Rodeo, and had a blast!  This event is for ERT groups from NW Oregon/SW Washington to meet each other and practice their skills.  We got assigned to mixed-jurisdictional teams of about 6–8 people each, including one radio operator.  Each team then rotated through 10 stations where we got to practice triage, cribbing and extrication, first aid, victim transport, fire suppression, SAR, etc.  Big fun.

This year it was hosted at TVF&R’s training facility, so we got to put out car fires, search a 5-story building, look for victims in the dark, and extricate victims from under actual rubble.  Best of all, we had instructors from fire departments all over the area coaching and teaching us (most of them on their own time; thanks, guys!).  Disneyland for safety nerds!

We’re actually running another rodeo this year, in September, so if you’re on a local CERT team, come out and play.  If you aren’t, go get trained.

Monday, June 19, 2006 11:33:51 AM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  |