# Monday, 06 December 2004
I’ll be teaching again next term at OIT (at the CAPITAL Center in Beaverton), this time “Enterprise Web Services”.  We’ll be looking at what it takes to build a real-world enterprise application using web services, including topics such as asynchronous messaging, security, reliable messaging, and a host of others.  We’ll walk through all the stages of building an enterprise-level WS application, using .NET and WSE 2.0 to do the heavy lifting.  A firm grasp of programming in C# and a basic understanding of Web Services fundamentals such as XML, SOAP, and WSDL are required.
Monday, 06 December 2004 13:18:30 (Pacific Standard Time, UTC-08:00)  #    Disclaimer  |  Comments [0]  | 

I just finished Neal Stephenson’s Cryptonomicon, and what a ride it was!  Snow Crash is one of my favorite books ever, but I was pretty underwhelmed by The Diamond Age, so I’d put off reading Cryptonomicon.  I now see that was a mistake.  I enjoyed the whole book, and couldn’t put it down for the last 100 or so pages. 

Stephenson employs the same fabulously wacky use of language that he did in Snow Crash, but along slightly less absurdist lines.  Snow Crash was cool, but definitely felt fictional (although not that far off from where we are today, frankly).  Cryptonomicon, on the other hand, is a bit more down to earth in its subject matter, even though it’s certainly fanciful.

I thought he did a great job of capturing modern nerd culture, and was gratified to see that Randy the geek turns out to be the hero of the tale in the end.  A great read.  Now I’ll have to start the Baroque Cycle…

Monday, 06 December 2004 13:04:38 (Pacific Standard Time, UTC-08:00)  #    Disclaimer  |  Comments [0]  | 
Check out the ever-well-informed-and-entertaining Stuart Celarier this Thursday at CAPITAL center in Beaverton.  Should be a good talk.  If you ask nicely he might even juggle.  :-)
Monday, 06 December 2004 10:38:27 (Pacific Standard Time, UTC-08:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, 30 November 2004

We’ve recently switched to the latest version of CruiseControl.NET (0.7), and my favorite new feature is ccnet’s ability to deal with CVS directly.  Previously we had to include code in our NAnt build file to do a CVS update at the beginning of the build, and then, if the build was successful, do a CVS tag at the end (so we could tag all the files with the build version). 
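
(For reference, the old hand-rolled approach looked roughly like the sketch below.  This is just an illustration, driving the plain cvs command line through NAnt’s exec task; the target names, property name, and tag format here are made up, not our actual build file.)

  <target name="update-source">
    <!-- refresh the working copy before the build starts -->
    <exec program="cvs" commandline="-q update -d -P" />
  </target>

  <target name="tag-build">
    <!-- tag all the files with the build version once the build succeeds -->
    <exec program="cvs" commandline="tag ver-${build.label}" />
  </target>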

The new ccnet will do the update and the label for us, but…

It only supports one format for the labels, which is to allow you to specify a prefix, like “1.0.”, and it will increment a number and append it, so you get “ver-1.0.1”, “ver-1.0.2”, etc.  That number resets to 1 every time you restart the ccnet executable.  Hmmm.  What we wanted was to keep our previous scheme, which used the version number we use for our .NET executables (e.g. 1.0.222.3333).  We had been using the version task from NAntContrib to create that version number using the formula x.y.monthday.secondssincemidnight. 

Luckily, ccnet 0.7 provides an interface for the labeling, so you can write your own scheme.  Ours now looks like this…

    using System;
    // NetReflector supplies the [ReflectorType]/[ReflectorProperty] attributes;
    // the CruiseControl.NET core assembly supplies ILabeller and IIntegrationResult.
    using Exortech.NetReflector;
    using ThoughtWorks.CruiseControl.Core;

    [ReflectorType("ourlabeller")]
    public class OurLabeller : ILabeller
    {
        private string majorMinor = "2.0";

        // The "major.minor" part of the version, overridable from ccnet.config.
        [ReflectorProperty("majorminor", Required=false)]
        public string MajorMinor
        {
            get { return majorMinor; }
            set { majorMinor = value; }
        }

        #region ILabeller Members

        public string Generate(IIntegrationResult previousLabel)
        {
            // Build "majorminor.monthday.secondssincemidnight", e.g. 2.0.1130.5896
            string ver = string.Format("{0}.{1}.{2}", majorMinor, getMonthDay(), calculateSecondsSinceMidnight());
            return ver.Replace(".", "_"); // it's a CVS label, so no dots allowed
        }

        #endregion

        #region ITask Members

        public void Run(IIntegrationResult result)
        {
            result.Label = Generate(result);
        }

        #endregion

        private int calculateSecondsSinceMidnight()
        {
            // Seconds since midnight, divided by 10 to keep the value under the
            // 65535 limit on assembly version parts (a full day is 86,400 seconds).
            DateTime today = DateTime.Now;
            return (today.Hour * 3600 + today.Minute * 60 + today.Second) / 10;
        }

        public int getMonthDay()
        {
            // Concatenate month and day, e.g. November 30 becomes 1130.
            DateTime time = DateTime.Now;
            string timeString = string.Format("{0}{1}", time.Month, time.Day);
            return Convert.ToInt32(timeString);
        }
    }
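
For example, with the default majorminor of 2.0, a build kicked off at 16:22:47 on 30 November comes out as 2_0_1130_5896: month and day concatenate to 1130, and 58,967 seconds since midnight divided by 10 gives 5896.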

So now ccnet will use our labeling scheme, as long as we stick our new class in an assembly called ccnet.*.plugin.dll.  The config file bit looks like this:

  <labeller type="ourlabeller">
    <majorminor>2.0</majorminor>
  </labeller> 

We want the version of the assemblies to match the newly generated label, so we need to read it in our NAnt build file.  CCNet stuffs the label into a property called ccnet.label that gets passed to NAnt, so we can pick it up in our build like this…

  <if propertyexists="ccnet.label">
    <script language="C#">
      <code><![CDATA[
        public static void ScriptMain(Project project) {
          // Turn the label (like 2_0_1130_5896) back into a dotted
          // version number (2.0.1130.5896) for the assemblies.
          string projectVersion = project.Properties["ccnet.label"];
          project.Properties["project.version"] = projectVersion.Replace("_", ".");
        }
      ]]></code>
    </script>
  </if>
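
From there, project.version can feed whatever stamps the assembly versions.  As a rough sketch (assuming NAnt’s standard asminfo task; the output file name here is just illustrative), something like this writes it into a generated AssemblyInfo file:

  <asminfo output="AssemblyVersionInfo.cs" language="CSharp">
    <imports>
      <import namespace="System.Reflection" />
    </imports>
    <attributes>
      <attribute type="AssemblyVersionAttribute" value="${project.version}" />
    </attributes>
  </asminfo>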

Tuesday, 30 November 2004 16:22:47 (Pacific Standard Time, UTC-08:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, 09 November 2004

The Hollywood Foreign Press Association has announced that it won't consider Michael Moore's Fahrenheit 9/11 for any Golden Globe awards because they don't give awards for "documentaries". 

Apparently they haven't actually watched the film.  (I finally saw it last week.)  Whether or not you agree with Moore, you can hardly call F 9/11 a "documentary".  It's clearly political theater, and anyone who claims it's a documentary is missing the point.  A lot of criticism that came out against the film centered on the fact that it was biased and didn't give both sides equal time.  Of course it was biased.  It's theater.  Documentarians don't pull stunts like Moore does.  Political satirists do. 

Anyway, agree with Moore or not, I think it's a bit disingenuous of the HFPA to claim that they won't consider it because it's a "documentary".

Tuesday, 09 November 2004 15:32:13 (Pacific Standard Time, UTC-08:00)  #    Disclaimer  |  Comments [2]  | 
# Monday, 08 November 2004

Our CTO, Chris, recently turned me on to Ruby.  I've been playing around with it a bit over the last few weeks, and I've got to say I'm pretty impressed.  I really appreciate that it was designed, as they say, according to the “Principle of Least Surprise”, which means that it basically works the way you would think. 

Ruby has a lot in common with Smalltalk, in that “everything is an object” kinda way, but since Ruby's syntax looks more (to me at least) like Python or Boo, it feels more natural than Smalltalk.  Sure, you don't get the wizzy browser, but that's pretty much OK.  When you apply the idea that everything is an object, and you're just sending them messages to ask them (please) to do what you want, you get some amazingly flexible code.  Sure, it's a bit squishy, and for code I was going to put into production I still like compile-time type safety, but for scripting or quick tasks, Ruby seems like a very productive way to go.

Possibly more impressive was the fact that the Ruby installer for Windows set up everything exactly the way I would have thought (“least surprise” again), including adding the ruby interpreter to the path (kudos) and setting up the right file extension associations so that everything “just worked”.  Very nice.

The reason Chris actually brought it to my attention was to point me at Rails, which is a very impressive MVC framework for writing web applications in Ruby.  Because Ruby is so squishily late-bound, it can do some really amazing things with database accessors.  Check out the “ActiveRecord” in Rails for some really neat DAL ideas. 

I'm assuming that that same flexibility makes for some pretty groovy Web Services clients, but I haven't had a chance to check any out yet.  Anyone have any experience with SOAP and Ruby?

Monday, 08 November 2004 18:48:14 (Pacific Standard Time, UTC-08:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, 02 November 2004
Just in case you somehow missed the fact that it's election day, Go Vote!  If you don't vote now, you don't get to bitch later. :-)
Tuesday, 02 November 2004 09:43:40 (Pacific Standard Time, UTC-08:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, 29 October 2004

The last working day before Halloween is upon us (as evidenced by the giant Sponge Bob sitting over in QA), and Drew brings us a poem of distributed systems horrors.  A great read!

Friday, 29 October 2004 09:44:02 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, 27 October 2004

So, one of the things we do in CERT (Community Emergency Response Team) training is simulated disaster exercises.  One of the things you need for simulated disasters is simulated victims.  What you want from a good simulated victim is something that looks realistic, and preferably really gross.  The theory is that if you get used to dealing with simulated grossness, it will be easier to deal with actual grossness when it shows up. 

Enter "moulage".  (I'd never heard that word either.)

"Moulage" is the art of making people up to look like they have horrible disfiguring wounds that will be noticed by people in medical training, and dealt with accordingly.  Last night I went to a moulage class put on for our local CERT team, and it was really a lot of fun, in a gross, messy kind of way.  I hope to have pictures later. 

We spent over an hour looking at a combination of moulaged volunteers and actual accident victims to get a feel for what we were trying to achieve.  Yuck.  People can get hurt in really messy ways.  Then we set about making each other look gross, which was much more fun. :-)

My favorite ones were the burns.  It turns out to be really easy to make some really nasty and convincing looking third degree burns (using common household items).  I had a harder time with the "impaled objects" and compound fractures.  Mostly because I was trying to work on myself.  It's way easier to get a chicken bone to stick out of someone else's arm than your own.  And it takes a lot more artistic skill than the burns to get the colors right.  Ah, well.  Now I have an excuse to dress my kids up as accident victims for Halloween.  I need to practice.

Wednesday, 27 October 2004 15:16:34 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, 26 October 2004

As most of you probably have already heard, according to Dare, we won't be getting XQuery with Whidbey. 

LAME!

One of the reasons given for this decision is that customers want something that is compliant with W3C standards.  OK, that's true.  But I would disagree that people will only use something that is compliant with a full recommendation.  Back in the day when MS first started putting out XML tools (early MSXML, SOAP Toolkit, etc.), many of those tools were built around working drafts, and we still managed to use them to get real work done.  I would argue that even if the XQuery spec were to change substantively between now and its full recommendation-hood (which I doubt), there's plenty of opportunity to get real work done with XQuery starting right now. 

The counter-argument is that people don't want to make changes to their code when the real spec ships.  Guess what!  There have been breaking changes introduced in each new revision of the .NET framework.  People have to change their code all the time.  I had to unwind a huge number of problems due to the changes in remoting security between .NET 1.0 and 1.1.  Somehow we manage.  The excuse of "well, you still have XSLT" just doesn't cut it IMHO.  XSLT is a much more difficult programming model than XQuery, and most people to this day don't get how the declarative model in XSLT is supposed to work.  XPath 1.0 is very limiting, which is why there's an XPath 2.0/XSLT 2.0 (which also aren't going to be supported in Whidbey!). 

I have to wonder if performance issues aren't closer to the truth of why it's not shipping.  Writing an engine that can do arbitrary XQuery against arbitrary documents certainly isn't an easy thing to do.  Think SQL.  SQL Server is a non-trivial implementation, and there's a reason for it.  I'm guessing that the reality of trying to make XQuery perform the way people would expect is a pretty daunting task. 

Either way, I think it's a drag that we won't get XQuery support, recommendation or no.

XML
Tuesday, 26 October 2004 11:00:54 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  |