# Monday, August 11, 2003

I spent some time late last week working with my colleague Jeff Berkowitz on what seemed like a pretty sticky problem.  He’s got some more info about the problem here, but the quick description is that we were in a situation where Control.InvokeRequired was apparently returning an incorrect value.  Our code was clearly running on a thread other than the GUI thread, and yet InvokeRequired returned false.  Disconcerting to say the least. 

Jeff spent quite a bit of time tracking down the problem over the weekend, and the conclusion that he came to is that if you want to call Control.InvokeRequired and get a rational and deterministic answer, the control you are calling it on must be embedded in the control containment hierarchy all the way up to a top level Form.  What we were doing involved creating controls but holding them in memory without putting them into the “visual” hierarchy of the application, meaning that they were not contained by a parent Control or Form.  Apparently if that is the case, InvokeRequired doesn’t return the correct results.  (Note that so far this is based on experiential evidence and not “scientific” data.)

The longer I think about this, the less surprised I am that it’s the case, but I’ve never seen any hint of documentation (from MS or otherwise) that indicates that it’s true.  The solution (at least for us) is pretty straightforward, and involved deferring some initialization code until after all the controls have been fully created and sited in forms.  Not a big deal, but it does prevent us from doing some up-front initialization in a place where our users wouldn’t notice it.  Now it’ll take a progress dialog.  Seems like a reasonable price to pay, but it would, of course, be nicer if it worked the way we expected it to.
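For the record, here’s a minimal sketch of the failure mode we saw (names are invented, and this is a repro sketch against .NET 1.1-era WinForms, not production code):

```csharp
// A control that has been constructed but never sited on a Form is
// outside the containment hierarchy. In our experience, calling
// InvokeRequired on such a control from a worker thread can return
// false even though we are clearly not on the GUI thread.
using System;
using System.Threading;
using System.Windows.Forms;

class Repro
{
    static Label orphan;   // created, but never added to a Form

    [STAThread]
    static void Main()
    {
        orphan = new Label();   // lives outside any parent Control or Form

        Thread worker = new Thread(new ThreadStart(CheckFromWorker));
        worker.Start();
        worker.Join();
    }

    static void CheckFromWorker()
    {
        // We are not on the thread that created the control, yet this
        // reported false in the scenario described above.
        Console.WriteLine("InvokeRequired: " + orphan.InvokeRequired);
    }
}
```

Siting the Label on a Form before the worker thread starts is what made InvokeRequired behave the way we expected.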

Monday, August 11, 2003 4:05:21 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, August 05, 2003

So far I’ve done some experiments with a couple of the options I mentioned in my last post.   I tried styling the incoming XML document into something I could read into a dataset, and then used SqlDataAdapter.Update to persist the changes.  This works pretty well, but the biggest issue ends up being foreign key constraints.  I think you’d either have to do some pretty funky stuff in the stylesheet, or clean up the foreign keys once they were in the dataset, although that only works if your dataset doesn’t have constraints to begin with. 
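The dataset experiment looked roughly like this (a hedged sketch; the table name, connection string, and the assumption that the styled XML produces a “Readings” table are all mine):

```csharp
// Sketch of the dataset approach: read the styled XML into a DataSet,
// then let a SqlDataAdapter persist the new rows. Rows loaded via
// ReadXml come in with a RowState of Added, so Update issues INSERTs.
using System.Data;
using System.Data.SqlClient;

class DataSetPersist
{
    static void Save(string styledXmlPath, string connectionString)
    {
        DataSet ds = new DataSet();
        ds.ReadXml(styledXmlPath);   // assumes the styled XML yields a "Readings" table

        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT * FROM Readings", connectionString);
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

        adapter.Update(ds, "Readings");
    }
}
```

This is where the foreign key problem bites: if the inserts have to land in multiple related tables, you need the parent rows (and their keys) in place before the children, which is exactly the funky stylesheet or dataset cleanup work mentioned above.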

Then I tried OPENXML, and I’ve got to say that so far that’s the way I’m leaning.  It turned out to make things much easier if I style the incoming XML into a simpler format (without all the namespaces) then pass that to OPENXML.  The OPENXML code turned out to be way less hairy than I had thought it might be, and I can handle the transaction in the stored proc rather than using DTC transactions.  All in all, not a bad thing.  It’s almost enough to make me not care if things change in Yukon in ways that would make this not work, or be obsolete.  It’s pretty slick in the near term.  I haven’t done any performance testing yet, but my hunch is that the OPENXML solution will be faster. 
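The shape of the stored proc is roughly this (a sketch only; the proc, table, and element names are all invented, and real code would want error checking around the transaction):

```sql
-- SQL Server 2000: shred the simplified XML with OPENXML and do the
-- inserts inside one transaction in the proc, no DTC required.
CREATE PROCEDURE InsertReadings @doc ntext
AS
BEGIN
    DECLARE @hdoc int
    EXEC sp_xml_preparedocument @hdoc OUTPUT, @doc

    BEGIN TRANSACTION

    -- Flag 2 = element-centric mapping, which suits the styled format.
    INSERT INTO Readings (MonitorId, Taken, Value)
    SELECT MonitorId, Taken, Value
    FROM OPENXML(@hdoc, '/readings/reading', 2)
         WITH (MonitorId int, Taken datetime, Value float)

    COMMIT TRANSACTION

    -- Always release the document handle, or the parsed doc leaks memory.
    EXEC sp_xml_removedocument @hdoc
END
```

Multiple SELECT/INSERT pairs against different row patterns in the same doc is what makes the normalized-tables-in-one-transaction story work.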

I could try the other option of parsing the XML in C# and then making transacted ADO.NET calls to persist the data, but I don’t really want to go there.  It’s the business-layer XML parsing I’m trying to get rid of, and it’s a lot more code. 

Tuesday, August 05, 2003 7:34:08 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Wednesday, July 30, 2003

When new data comes in from monitors, it needs to be processed and then stored in the DB.  (See Xml on Large Power Transformers and Industrial Batteries for some background.)  I’m currently pondering how to manage that, and what I’m thinking is that I’ll process the incoming data as XML, using XPath and some strongly typed methods for doing inserts.  The issue I’m wrestling with is how to get the XML data into the database.  We’re currently using SQLXML for some things, so I could style it into an updategram.  Initially I was thinking of just chucking the whole incoming XML doc into a stored proc, and then using SQL Server 2K’s OPENXML method to read the document inside the stored procedure and do all the inserts from there.  The advantage is that I can rip up the doc and do a bunch of inserts into normalized tables and keep it all in the context of one transaction in the sproc.  Also, it keeps me from having to write C# code to parse out all the stuff that’s in the doc and do the inserts from there (although that’s also an option).  Usually I’m opposed to putting much code into the DB, since it’s the hardest bit to scale, but in this case it wouldn’t be business logic, just data parsing, which seems like a pretty reasonable thing for a database application to do. 

With that all out of the way, my concern is that OPENXML is fairly complex, and would require some relatively involved TSQL (which I can do, but would rather not, and hate to maintain).  Also, I worry about it all being made irrelevant by Yukon, which is supposed to have some pretty wizzy new XML related data storage stuff.  

Another option would be to style the incoming data into a dataset and use ADO.NET for the dirty work.  Since I’m really only ever doing inserts, not updates, it should be pretty straightforward. 

Sigh.  Too many choices.

Wednesday, July 30, 2003 7:27:59 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [2]  | 
# Friday, July 18, 2003

It’s hard to tell how much of this is an unfinished documentation issue, but in the WSE 2 Tech Preview, there doesn’t seem to be any easy way to do encryption using symmetric keys.  The sample code in the documentation is the same as in WSE 1, and it doesn’t compile any more, since the version of the EncryptedData constructor that takes an EncryptionKey has been made obsolete.  Now you have to pass a SecurityToken, but there isn’t one for symmetric keys.  This seems like a pretty straightforward thing to want to do, so I don’t really want to write a custom binary token to do it.  There are some hints in the documentation that there is a plan for a symmetric key token (or maybe I just haven’t figured it out yet).   Right now this is a total pain, since I have a bunch of existing WSE 1 code that uses symmetric keys.  If nothing better happens by the time WSE 2 is released, I may have to write a custom binary token anyway.  It’s nice that they’ve put in so much work to make Kerberos and X509 easy, but since my client is an embedded device, those aren’t really viable choices. 


Friday, July 18, 2003 2:03:25 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, July 15, 2003

I know this is a bit after the fact, but I’ve had time to reflect.  Steve gave a very interesting talk on the feasibility of using SOAP/Web Services with embedded devices, which in his case was a printer.  I’m currently also working on an embedded Web Services project, so I was really interested in what Steve had to say.  His conclusion was that although there are some issues to be overcome, it is in fact feasible to use Web Services with an embedded device.  In the work I’ve done so far, I’d have to agree. 

The really interesting part for me was that the challenges Steve talked about with his printer were completely different from the challenges I’m facing talking to transformer monitors.   It sounded like the biggest hassle with using WS on the printer was dealing with large binary files as attachments (since it was actually a 3D printer, and 3D models can be very large).  There are several different ways of dealing with attachments to SOAP, and they all have their challenges. 

Our biggest challenge, on the other hand, is that since our monitors are meant to work over the WAN, we can’t assume connectivity to them all the time, which you probably can with a printer.  Also, it sounded like using Java on the printer wasn’t too big a problem, while we are working with straight C++ in a very constrained environment, for which there are many fewer tools available.

The short answer then to “is it feasible” is of course, “it depends”.  It worked for Steve.  So far it’s working for me.  I’ll keep my fingers crossed.

Tuesday, July 15, 2003 4:21:04 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [1]  | 
# Monday, July 14, 2003
A few hearty souls stuck it out until the bitter end to hear me ramble about XML on Large Power Transformers and Industrial Batteries. I got some good questions, and a couple of cool bits of feedback. I spent some time talking about the fact that one of the reasons we decided to go with XML Web Services in the embedded space was to take advantage of infrastructure implementations arising from GXA. For example, I don't have to wonder about how to deal with security of Web Services, because there's WS-Security, and better still in the future I'll be able to get a WS-Security appliance in the same way that I can get an SSL appliance today. Rich Salz, who gave the talk right before mine (Determining what Security your Web Services need) mentioned that his company makes such a thing today. Hooray! Very cool. With any luck, in the future I'll be able to get WS-Routing appliances also.
The other thing I talked about was the fact that our model is a little different from the usual Web Services models (Server to Server, or Smart Client) since we have to deal with the fact that our monitors aren't reachable (they have to call us) and that they can't maintain state about what data they've sent us. Given that fact that our monitors are meant to give utilities the information they need to keep their transformers from blowing up, Tim Ewald suggested that we call it the Exploding Client model. :)
Maybe it'll stick...
Monday, July 14, 2003 1:04:00 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [1]  | 
Aaron demonstrated some very cool techniques using XPath assertions to validate XML documents (specifically the contents of SOAP messages). He uses SOAP extensions and attributes that indicate which XPath assertions to apply to a given WebMethod. It apparently is much faster than doing full schema validation, and easier to set up, since most of the time you really only want to assert a few key requirements and don't need to validate the whole data structure exhaustively. I hadn't heard of Schematron before, but apparently it uses a similar technique using XPath assertions. There is a .NET implementation of Schematron here.
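The core of the idea is simple enough to sketch (this is my own hedged illustration, not Aaron's actual code; the document shape and assertions are invented):

```csharp
// Instead of full schema validation, run a handful of XPath assertions
// against the incoming document and fail fast if any of them miss.
using System;
using System.Xml;

class XPathAssertions
{
    static void Validate(XmlDocument doc)
    {
        string[] assertions = {
            "/order/@id",            // the order must carry an id attribute
            "/order/items/item[1]",  // at least one line item
            "/order/customer"        // a customer element must be present
        };

        foreach (string xpath in assertions)
        {
            if (doc.SelectSingleNode(xpath) == null)
                throw new ApplicationException("XPath assertion failed: " + xpath);
        }
    }
}
```

In the SOAP-extension version, attributes on the WebMethod would supply the assertion list, and the extension would run this check against the message body before the method ever executes.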
Monday, July 14, 2003 12:59:17 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Friday, July 11, 2003
I got a question yesterday about what was up with the formatting on my blog, and why it jumps around.
Well, with some help from Mike Amundsen at EraBlog.net where my blog is hosted, I'm now hosting the pages (but not the data) at my site. The permalinks on EraBlog now redirect to my site (http://www.cauldwell.net/patrick/blog) so if you use that link to get at my blog rather than /blogs/pcauldweblog the formatting will remain consistent.
For anyone interested in the implementation details, I'm using ASP.NET, using EraBlog's toolkit which makes SOAP requests back to EraBlog.net to get the data for my blog, then formatting it on my pages. Check this out for more details...
Friday, July 11, 2003 6:32:52 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
The sweet spot according to Tim:
Write schemas with global element declarations using anonymous types. To process them, wrap them in something that carries a DOM, exposes the Navigator interface, and gives you type-safe methods for modifying the documents.
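As I understand it, the wrapper might look something like this (a rough sketch with invented names, not anything Tim showed):

```csharp
// A wrapper that owns a DOM, hands out an XPathNavigator for generic
// querying, and offers strongly typed methods for the specific edits
// callers actually need to make.
using System;
using System.Xml;
using System.Xml.XPath;

class PurchaseOrderDoc
{
    private XmlDocument doc = new XmlDocument();

    public PurchaseOrderDoc(string xml)
    {
        doc.LoadXml(xml);
    }

    // Query side: expose the Navigator interface over the carried DOM.
    public XPathNavigator CreateNavigator()
    {
        return doc.CreateNavigator();
    }

    // Edit side: a type-safe method instead of raw node manipulation.
    public void SetShipDate(DateTime when)
    {
        XmlElement e = (XmlElement)doc.SelectSingleNode("/purchaseOrder/shipDate");
        if (e == null)
        {
            e = doc.CreateElement("shipDate");
            doc.DocumentElement.AppendChild(e);
        }
        e.InnerText = XmlConvert.ToString(when);
    }
}
```

The appeal is that generic tooling (XPath, XSLT) still works through the Navigator, while application code gets a small, typed surface it can't misuse.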
This turns out to be very similar to some stuff I've been thinking about for processing/pipelining XML documents. More on this later...
Friday, July 11, 2003 5:48:39 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
Amazon gets Web Services. Jeff Barr is talking about how and why Amazon implements Web Services, and they really do it right. They have a real business model and Web Services (both SOAP and REST) that fit that business model and allow both Amazon and their associates to make money.
Their Web Service model is really just an extension of their earlier associate program, and basically just gives associates way more control over the look of their site and additional features they can support.
What I find really interesting is that there's a whole economy building up around this technology. There are even solutions available such as StorePerfect, which is a set of ASP.NET controls that wrap the Amazon web services, so all you have to do to set up a commerce site using Amazon is use these ASP.NET controls. So if you get a good domain name and do a little ASP.NET coding, you can set up a site that makes you money without having to handle any goods, process any payments, or ship anything. If that's not an appropriate use of Web Services, I don't know what is.
Friday, July 11, 2003 2:25:00 PM (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  |