# Wednesday, July 30, 2003

When new data comes in from monitors, it needs to be processed and then stored in the DB. (See Xml on Large Power Transformers and Industrial Batteries for some background.) I'm currently pondering how to manage that, and my current thinking is to process the incoming data as XML, using XPath and some strongly typed methods for doing the inserts. The issue I'm wrestling with is how to get the XML data into the database. We're currently using SQLXML for some things, so I could style the data into an updategram.

Initially I was thinking of just chucking the whole incoming XML doc into a stored proc and using SQL Server 2K's OPENXML function to shred the document inside the stored procedure and do all the inserts from there. The advantage is that I can rip up the doc, do a bunch of inserts into normalized tables, and keep it all in the context of one transaction in the sproc. It also keeps me from having to write C# code to parse out everything in the doc and do the inserts from the app (although that's also an option; there's a sketch of it below). Usually I'm opposed to putting much code into the DB, since it's the hardest tier to scale, but in this case it wouldn't be business logic, just data parsing, which seems like a pretty reasonable thing for a database application to do.
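
For the sake of comparison, here's roughly what the do-it-in-C# option looks like: XPath to pull the values out, parameterized inserts, one transaction around the lot. The element, table, and column names are invented for the sketch, not our actual schema.

```csharp
// Rough sketch of the "parse it in C#" option. The element, table, and
// column names are hypothetical; the real monitor schema looks different.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Xml;

public class MonitorDataImporter
{
    public void Import(string xml, string connectionString)
    {
        XmlDocument doc = new XmlDocument();
        doc.LoadXml(xml);

        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();
            SqlTransaction tx = conn.BeginTransaction();
            try
            {
                // Hypothetical shape: /monitorData/reading elements carrying
                // monitorId, timestamp, and value attributes.
                foreach (XmlNode reading in doc.SelectNodes("/monitorData/reading"))
                {
                    SqlCommand cmd = new SqlCommand(
                        "INSERT INTO Readings (MonitorId, ReadingTime, Value) " +
                        "VALUES (@monitorId, @readingTime, @value)", conn, tx);
                    cmd.Parameters.Add("@monitorId", SqlDbType.Int).Value =
                        int.Parse(reading.Attributes["monitorId"].Value);
                    cmd.Parameters.Add("@readingTime", SqlDbType.DateTime).Value =
                        DateTime.Parse(reading.Attributes["timestamp"].Value);
                    cmd.Parameters.Add("@value", SqlDbType.Float).Value =
                        double.Parse(reading.Attributes["value"].Value);
                    cmd.ExecuteNonQuery();
                }
                tx.Commit();
            }
            catch
            {
                // Any failure rolls back the whole batch of inserts.
                tx.Rollback();
                throw;
            }
        }
    }
}
```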

With all that out of the way, my concern is that OPENXML is fairly complex, and would require some relatively involved TSQL (which I can write, but would rather not, and would hate to maintain). I also worry about the whole thing being made irrelevant by Yukon, which is supposed to have some pretty whizzy new XML-related data storage features.

Another option would be to style the incoming data into something a DataSet can load and let ADO.NET do the dirty work. Since I'm really only ever doing inserts, not updates, it should be pretty straightforward.
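
If I go that route, the code would look something like the sketch below. Again, the schema file, element, table, and column names are placeholders, not the real thing; the only ADO.NET wrinkle worth noting is that ReadXml infers every column as a string unless a schema is loaded first, hence the ReadXmlSchema call.

```csharp
// Sketch of the DataSet route, assuming the incoming doc has already been
// styled into repeated <Reading MonitorId="..." ReadingTime="..." Value="..."/>
// rows. File, table, and column names are placeholders.
using System;
using System.Data;
using System.Data.SqlClient;

public class DataSetImporter
{
    public void Import(string styledXmlPath, string connectionString)
    {
        DataSet ds = new DataSet();
        ds.ReadXmlSchema("Reading.xsd"); // load the XSD so columns come through typed
        ds.ReadXml(styledXmlPath);       // rows loaded this way have a RowState of Added

        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            SqlDataAdapter adapter = new SqlDataAdapter();
            SqlCommand insert = new SqlCommand(
                "INSERT INTO Readings (MonitorId, ReadingTime, Value) " +
                "VALUES (@MonitorId, @ReadingTime, @Value)", conn);
            insert.Parameters.Add("@MonitorId", SqlDbType.Int, 0, "MonitorId");
            insert.Parameters.Add("@ReadingTime", SqlDbType.DateTime, 0, "ReadingTime");
            insert.Parameters.Add("@Value", SqlDbType.Float, 0, "Value");
            adapter.InsertCommand = insert;

            // Since every row is Added, Update only ever fires the InsertCommand.
            adapter.Update(ds, "Reading");
        }
    }
}
```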

Sigh.  Too many choices.