An Old Problem is New Again

I ran into a problem yesterday that brought back memories. And not the good kind of memories either.

I'm doing some work using TypeConverters. In particular, I'm creating a Windows Forms control that functions in a manner similar to a PropertyGrid, but with a user interface that looks more like a typical Windows form. You know, with labels and text boxes. Because this is a generic control (that is, the fields that get displayed depend on the object associated with the control), I'm doing a lot of reflection to retrieve the properties that are required. And I'm using reflection (GetValue/SetValue) to interact with the object. This last part (SetValue in particular) means that I'm using type converters to move the string values in the text boxes into the actual properties themselves.

Now type converters are quite useful. They expose a ConvertFromString method that (not surprisingly) takes a string value and converts it to a value of the associated type. If the string is not valid for the destination type, an exception is thrown.

Problem 1: the raised exception is System.Exception. I have no idea why the exception surfaced by ConvertFromString is a plain Exception. The InnerException is a FormatException, which is the type of exception I would have expected. But no, ConvertFromString throws Exception. This forces me to use a pattern that I loathe: try {...} catch (Exception) { }

Whenever I teach, I always harp on this. You should (almost) never catch the general Exception object. You should only catch specific Exceptions. But because of ConvertFromString, I have to add to my list of cases where catch (Exception) is required.
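
To make the pattern concrete, here is a minimal sketch of the kind of code involved (the property, target and textBox names are placeholders for the control described above, and MarkFieldAsInvalid is a hypothetical UI helper; requires System.ComponentModel and System.Reflection):

TypeConverter converter = TypeDescriptor.GetConverter(property.PropertyType);
try
{
   // ConvertFromString throws System.Exception (not FormatException) on bad input
   object value = converter.ConvertFromString(textBox.Text);
   property.SetValue(target, value, null);
}
catch (Exception)
{
   // forced to catch the general Exception - see Problem 1 above
   MarkFieldAsInvalid(property.Name);   // hypothetical UI feedback
}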

But why should this matter, you ask. TypeConverter exposes an IsValid method. You should be able to check the validity of the string value prior to calling ConvertFromString. This would completely eliminate the need to catch any exceptions at all. And it's the best practice/recommended/ideal way to go.

Problem 2: Int32Converter.IsValid("bad") returns true.

Think about that last one for a second.

According to the type converter for Int32, the string 'bad' is a valid integer value. In my world, not so much. If you spelunk the various classes in the .NET Framework, you find out that IsValid is supposed to be overridden in the various converter classes. But the numeric classes don't bother to do so. As a result, the IsValid that actually services my request just returns true regardless of whether the string is valid or not.
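
A three-line console test makes the point (behaviour as observed on the framework version discussed here):

TypeConverter converter = TypeDescriptor.GetConverter(typeof(int));  // Int32Converter
Console.WriteLine(converter.IsValid("bad"));         // prints True
object value = converter.ConvertFromString("bad");   // throws System.Exception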

What's worse, this is a known bug that will never be resolved. Apparently, making IsValid work for Int32Converter is considered a breaking change.

So my suggestion (courtesy of Dave Lloyd) is to add a new method to the TypeConverter classes. Call it IsReallyValid. IsReallyValid could then be implemented properly without the breaking change. It could take the CultureInfo object necessary to truly determine whether a string is valid. For the base type converter, it could simply return true.

Heck, let's go a step further and mark IsValid as being Obsolete. That would force people to move to IsReallyValid and correct any problems in their applications. And maybe, in a couple of versions, Microsoft could reintroduce a working version of IsValid and mark IsReallyValid as being obsolete. In this way, four versions from now (that's about 10 developer years), IsValid will work the way it's supposed to (and everyone would expect it to).
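
In the meantime, the workaround is to do the validation yourself. Here's a sketch of a derived converter that overrides IsValid the way the numeric converters should have (my code, not part of the framework; requires System.ComponentModel and System.Globalization):

public class StrictInt32Converter : Int32Converter
{
   // Actually validate string input instead of inheriting the base class's 'return true'
   public override bool IsValid(ITypeDescriptorContext context, object value)
   {
      string text = value as string;
      if (text == null)
         return base.IsValid(context, value);

      int parsed;
      return int.TryParse(text, NumberStyles.Integer, CultureInfo.CurrentCulture, out parsed);
   }
}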

BizTalk -> Custom Pipeline Component for Processing DBF, Excel and other ODBC types

I deleted the original post by mistake (whoops!), so below is a replacement for the deleted one.

Last year a customer had a requirement to process DBF files in BizTalk. I created a custom pipeline component that saved the incoming binary stream to a physical file on the BizTalk machine and then used basic ADO.NET to parse the DBF File into an XML document. I then modified/extended this pipeline component to accept and parse other ODBC files to XML, such as:

DBF
Excel
FoxPro
Possibly others such as Access Files.

At this point in time, this custom pipeline component will only parse Excel and DBF files, but it is possible to modify the component to process other ODBC types.

When this custom pipeline component is used in a BizTalk Receive Pipeline, it does the following:

Raw DBF or Excel messages are delivered to BizTalk by any transport, such as:
File
FTP
MSMQ
etc. etc.

The raw message is then parsed to XML in a BizTalk Receive Pipeline, and the parsed XML message is published into the MsgBox.

This component requires no special APIs and uses basic ADO.NET code to parse the ODBC type files into XML.

You can download the full source code for the Custom Pipeline component at the end of this entry.

The component works as below:

1) The incoming file is saved to a temporary file on the BizTalk machine.
2) An OLEDB connection will be used to connect to the file from 1).
3) A SQL query is performed against the OLEDB data source.
4) The results from the query are stored to an ADO.NET dataset/datatable.
5) The XML is extracted from the datatable and modified for a root node name and target namespace.
6) The temporary file from 1) is deleted.
7) The XML from 5) is added back to the pipeline message stream.


The custom pipeline component was coded as a Decoder pipeline component, but it could be modified to implement a Disassembler pipeline component.
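
To give an idea of what step 7) from the list above looks like inside the Decode component, here is a rough sketch (not the component's exact code; ParseOdbcFileToXml is a hypothetical stand-in for steps 1 through 6):

// Inside the component's Execute method (IPipelineContext/IBaseMessage come from
// the Microsoft.BizTalk pipeline interop assemblies)
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
   // Steps 1) to 6): save the incoming stream, query it via OLEDB, build the XML
   string parsedXml = ParseOdbcFileToXml(pInMsg);   // hypothetical helper

   // Step 7): replace the body part's stream with the parsed XML
   MemoryStream outStream = new MemoryStream(Encoding.UTF8.GetBytes(parsedXml));
   outStream.Position = 0;
   pInMsg.BodyPart.Data = outStream;

   return pInMsg;
}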

The Custom Pipeline Component exposes a number of properties for dynamic configuration.

The connection string and query differ slightly between an Excel and a DBF file. Therefore the configuration for an Excel file and a DBF file is discussed separately:

Excel

The incoming Excel file to be parsed looks as below:

The resultant parsed XML file will look as below:

Note: Only two Employee nodes are present in the XML file due to a filter condition in the configuration (see below).

The Configuration for this Pipeline is as below:

1) ConnectionString -> The OLEDB Connection string for the Excel file.
The following is set for the ConnectionString property:
Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties=Excel 8.0;
But, the final Connection String that is produced by the code looks like below:
Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties=Excel 8.0;Data Source=C:\Temp\afgd1234.xls

This is because the code dumps the Excel file to the TempDropFolderLocation and must dynamically add the Data Source section to the connection string.
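
In other words, something roughly like this happens inside the component (a sketch; the actual variable names in the download will differ):

// The temp file written in step 1) is appended to the configured connection string
string tempFile = Path.Combine(TempDropFolderLocation, Path.GetRandomFileName() + ".xls");
string finalConnectionString = this.ConnectionString + "Data Source=" + tempFile;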

Note : Other Connection Properties for an Excel File:

"HDR=Yes;" indicates that the first row contains columnnames, not data
"IMEX=1;" tells the driver to always read "intermixed" data columns as text
(Above From: http://www.connectionstrings.com/ )

2) DataNodeName -> The XML Node name for the Data. In this case Employee

3) DeleteTempMessages -> If set to True, will delete the Excel file that is dropped to the TempDropFolderLocation after processing.

4) Filter -> Filter for the SqlStatement. In this case, it will only select LastNames Like %B%
Note: This is optional. If all data is to be returned, leave blank.

5) Namespace -> NameSpace for the resultant XML message.

6) RootNodeName -> Root Node Name for the resultant XML Message.

7) SqlStatement -> OLEDB Select Statement.
SQL syntax: SELECT * FROM [sheet1$] - i.e. worksheet name followed by a "$" and wrapped in "[" "]" brackets.
(Above From: http://www.connectionstrings.com/ )

Note: The SqlStatement could also look as below:
Select FirstName,LastName FROM [sheet1$]  (only bring back selected columns)
Select FirstName as FName, LastName as LName FROM [sheet1$] (rename the column Names in the resultant XML)

8) TypeToProcess -> In this case Excel File.

DBF

The incoming DBF file to be parsed looks as below:

The resultant parsed XML file will look as below:

Note: Only two Items nodes are present in the XML file due to a filter condition in the configuration (see below).

The Configuration for this Pipeline is as below:

Note: The above is an example of Per Instance Pipeline Configuration for BizTalk 2006.

1) ConnectionString -> The OLEDB Connection string for the DBF file.
The following is set for the ConnectionString property:
Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties=dBASE IV;
But, the final Connection String that is produced by the code looks like below:
Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties=dBASE IV;Data Source=C:\Temp\

This is because the code dumps the DBF file to the TempDropFolderLocation and must dynamically add the Data Source section to the connection string.

2) DataNodeName -> The XML Node name for the Data. In this case Items

3) DeleteTempMessages -> If set to True, will delete the DBF file that is dropped to the TempDropFolderLocation after processing.

4) Filter -> Filter for the SqlStatement. In this case, it will only select PRICE >= 200 and PRICE <= 500
Note: This is optional. If all data is to be returned, leave blank.

5) Namespace -> NameSpace for the resultant XML message.

6) RootNodeName -> Root Node Name for the resultant XML Message.

7) SqlStatement -> OLEDB Select Statement.

In this case, only the columns part of the Select statement is supplied, as below:
Select * 

This is because the code dumps the DBF File to the TempDropFolderLocation and must dynamically add the FROM statement as below:
SELECT * FROM i0lb1gcr.dbf
 
Note: The SqlStatement could also look as below:
Select COD, PRICE (only bring back selected columns)
Select COD as Id, Price as Amount (rename the Node Names in the resultant XML)

8) TypeToProcess -> In this case DBF File.
Note: When configuring the Pipeline Component in the BizTalk Server 2006 Administration console, TypeToProcess takes the following values:
0 -> Excel
1 -> DBF
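
Those values line up with the enum that backs the TypeToProcess property, which probably looks something like the following (a sketch inferred from the values above and the odbcType.Excel reference in the code excerpt further down):

public enum odbcType
{
   Excel = 0,
   DBF = 1
}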


You can download the code Here. Before installing, look at the Readme.
Note: This code was written in VS2005. If you want to use it in VS2003, create a new Pipeline project in VS2003 and then just copy the code from DecodeODBC.cs into the VS2003 class. Also, thoroughly test the code before using it.

Finally:

The not-so-good things about this component are:

1) It has to write the ODBC file locally to disk before parsing, which creates extra disk I/O. I did test it with multiple submissions of 1 MB DBF files, and the performance still seemed pretty good.

2) The types of Excel files it can process are flat. If the Excel files you need to process are complex, I'm not sure how well this component will parse them to XML.

The good things about this component are:

1) The code to parse the ODBC files is dead simple, looks something like the below:

 OleDbDataAdapter oCmd;
 // Get the filter if there is one
 string whereClause = "";
 if (Filter.Trim() != "")
   whereClause = " Where " + Filter.Trim();
 if (this.TypeToProcess == odbcType.Excel)
   oCmd = new OleDbDataAdapter(this.SqlStatement.Trim() + whereClause, oConn);
 else // dbf
   oCmd = new OleDbDataAdapter(this.SqlStatement.Trim() + " From " + filename + whereClause, oConn);
 oConn.Open();
 // Run the Select statement from above and fill a DataSet
 DataSet odbcDataSet = new DataSet();
 oCmd.Fill(odbcDataSet, this.DataNodeName);
 oConn.Close();
 // Write the XML From this DataSet into a String Builder
 System.Text.StringBuilder stringBuilder = new StringBuilder();
 System.IO.StringWriter stringWriter = new System.IO.StringWriter(stringBuilder);
 odbcDataSet.Tables[0].WriteXml(stringWriter);


2) This code can be modified to process other types of ODBC files. The modifications
may be minor.

3) You can filter the data in an incoming Excel or DBF file.

BizTalk R2 RFID and working with a real Reader Device (Phidget)

This entry discusses using a real RFID reader (Phidget) with BizTalk 2006 R2 RFID Beta 2. There are a couple of RFID samples provided with the RFID install, but they involve simulators for RFID devices. If you want to try a real RFID reader sample with BizTalk 2006 R2 RFID, read below.

Jim Bowyer, a Senior Technical Specialist for Microsoft based in Calgary, sent out a short note about an inexpensive RFID reader that has a community DSPI (Device Service Provider Interface)  for BizTalk R2 RFID.

So I ordered the RFID reader, downloaded the DSPI and then tried against my BizTalk 2006 R2 RFID Beta 2 installation.

If you want to try yourself, perform the following steps:

1) If you have not already done so, register for the public BizTalk 2006 R2 Beta 2, then download it.
Follow the instructions to install BizTalk RFID.

2) Order the PhidgetsRFID Reader (USB)

I opted to order the Phidget RFID Kit, which includes a set of read-only tags.

I live in Toronto, Canada, so the total cost for the RFID kit was:

$79.99 (US)  Kit
$49.74 (US)  Shipping
$11.98 (CAN) Customs Fees
-------------------
$141.71

If you live in the States, the total cost will probably be lower, with reduced shipping fees and no customs charges. I ordered the kit online on a Friday and received it the next Tuesday, so approximately two business days for delivery.
 
Below is the image of what you get in the kit:

The left hand side of the picture contains the various RFID tags, and the right side is the actual RFID reader. A USB cable is also included, so you can connect the RFID reader to a free USB port on your computer.

3) Connect the Phidget RFID reader to your computer via the USB cable.

My host machine is Windows XP. The Phidget device was picked up immediately. No extra drivers etc. were needed.

As below, I have BizTalk 2006 R2 Beta 2 and BizTalk RFID installed on a Windows 2003 VMware image, so the extra step in this case was to configure the VMware image to pick up the Phidgets USB device.

4) Download and install the Phidget DSPI from Irving De la Cruz's Blog 

http://blogs.msdn.com/irvingd/pages/biztalk-rfid-device-provider-dspi-for-phidget-devices.aspx

The instructions provided with the download are top-notch and I had it up and running within a few minutes. To install the DSPI, Irving provides a script file or a well-documented manual process using the RFID Manager console. I used the manual process to install and had only a couple of minor problems, as described below:

After installing the Phidget Device Provider, it would not start (an RFID Manager error was received). See below:

During the configuration of the Phidget Device Provider, an IIS virtual directory hosting a number of WCF services is created, as below:

As below, when trying to browse to one of the services, a Service Unavailable error was reported (see below).


To fix:

On my Windows 2003 Server, WSS (SharePoint) Services 2.0 was also present (from a previous install) along with RFID. Therefore, as below, I excluded the PhidgetProvider URL from the WSS managed paths. Also, for the PhidgetProvider virtual directory, I changed the Application Pool to one that runs under an Administrator account (just to get it to work). Once this was done, the PhidgetProvider would start in the RFID Manager. To recap, if you are having problems starting the Phidget Provider, ensure that you can successfully browse without error to one of the .svc services before trying to start the Phidget Provider.


Another problem I had, similar to the one above:
After configuring the SQL Sink Process to capture the RFID reads (using the BizTalk RFID Manager), the process would not start. As below, another IIS virtual directory is created hosting a number of WCF services. As explained above with the PhidgetProvider service, errors were reported when trying to browse to one of the TestProcess .svc services. Therefore, as above with the PhidgetProvider URL, I excluded the TestProcess URL from the WSS managed paths and fiddled with the permissions of the App Pool that the service was running under. When I could successfully browse to one of the TestProcess .svc services, the TestProcess started successfully in RFID Manager.

 

5) Test

After following the instructions in Irving's documentation, the Phidget Device Provider, the Phidget device, and the process to capture the reads are all enabled and have started successfully without errors (use the BizTalk RFID Manager to check).
As below, to test, place one of the tags within a few inches of the reader.

Then, as below, to see if it worked, you can use the RFID Manager to view the read tags with the View Tags dialog.


As above in the dialog, each tag has a unique Tag Id associated with it.

Final thoughts:

- Easy to set up.
- The RFID reader is inexpensive, shipping costs may be expensive though, depending on where you live.
- So far it has been stable.
- A great way to prototype/experiment with BizTalk RFID and a real RFID device.

Code-Name Jasper

Named after a venerable Canadian national park (due to the overwhelming influence of Canadians on the Microsoft Data team, who are taking over the U.S. one developer at a time), Jasper is one of the incubation projects from the System.Data team introduced at MIX '07.

Built on top of the Entity Framework, Jasper provides a set of classes that can easily be used to perform CRUD operations on data. While this doesn't sound particularly exciting when phrased like that, the reality is that the classes are generated under the covers at run time.

So if I have a database that contains a Customers table, Jasper allows me to create a DynamicContext object that has the connection string to the database. Then I can say "DynamicContext.Customers" and a Query object containing the customers in the database will be returned. Each customer object within this query has a set of properties that match the columns in the table. The magic here is that the database schema is being queried at runtime to determine which classes should be included in the application. And once retrieved, the query can easily be used to bind to elements on an ASP.NET Web page.

There are a number of interesting aspects as the talk gets closer to a real-world scenario. For example, it seems to be taking the Ruby approach of 'by convention' rather than 'by declaration'. If the Customers table had a property called Address, you could create a class named Customers and add a GetAddress method. When the data classes are generated at runtime, a call to the GetAddress method will be used in the property getter for Address. This is how overriding certain functionality can be achieved.

All in all, there are some interesting possibilities that could arise from this. If you want to play with what is there right now, check out http://msdn2.microsoft.com/en-us/data/bb419139.aspx

WPF Found a Home!

The more I see about Silverlight, both 1.0 and 1.1, the more I realize that WPF has found a client to service. From the moment I heard about XAML and WPF, I questioned where it was going to fit in the real world. A large part of the 'coolness' of XAML revolved around transitions and transformations. But business applications don't really need that. Business applications are, generally speaking, not flashy, demo-candy-ful systems. So this disconnect between the needs of business and the strengths of XAML left me wondering.

While I don't believe that WPF is going to replace Windows Forms anytime in the near future, I think that it's pretty obvious that the raison d'etre of XAML is to support Silverlight.

But it's important for developers to realize that XAML/WPF/Silverlight is still new technology. The one thing I haven't seen here at MIX is an integrated development environment. Orcas doesn't have a XAML designer...it has a link to edit the XAML in Expression Blend. The data binding story in XAML is a step back from what ASP.NET developers gained in .NET 2.0. Jasper and Astoria are interesting technologies, but the security and error management stories are still being developed. In other words, temper the excitement of some very, very slick technology with the realization that we still have a little time before it becomes 'real'.

Computer Canada Feature Article on Employees as Assets

The April 20th edition of Computer Canada had a feature article on (generally) the asset value that employees bring to a company. I mention this because, as it turns out, I'm quoted in the article. I believe that documentation is not the solution to the knowledge leakage problem discussed in the article. Developers don't like documentation, so the trick is to keep code clean and easy to read. Standards are important. Unit tests are important (they document the assumptions made by the developer). Enforcing both standards and appropriate levels of unit testing would, quite naturally, be even more important.

This combination is a significant benefit (albeit a soft one) that a company can accrue by using tools such as Team Foundation Server. Imposing process on software development efforts can be challenging, however. The metaphor that a development team is like a herd of cats is frequently quite accurate. Getting them to buy into the process is very important to its success. But a well-planned implementation that takes into consideration appropriate levels of training, along with a process that mirrors the current situation, can greatly help a company achieve its goal with respect to securing application knowledge.

Are you an aspiring architect?

I guess by the very definition of an architect, you would have to have aspirational qualities. Mohammad Akif and Dave Remmer, Architect Evangelists with Microsoft Canada, are putting on a series of webcasts just for you. Check them out....

Architecture 101 (Mohammad, May 24)

http://msevents.microsoft.com/cui/eventdetail.aspx?EventID=1032338971&culture=en-CA

Architecture is the balance between art and engineering; it requires a certain mindset and approach to solving problems. Architects often function as a bridge between the business users and development groups and are increasingly being recognized as a critical community within organizations. Becoming an architect can often translate into an elevated status from a career-stage perspective, but it is hard to find prescriptive guidance around how to become one. Join Mohammad Akif for the first of a four-part series focused on aspiring architects. During the Architecture 101 session we will discuss some key ideas around architecture and define the attributes of an architect.

Software development lifecycle and methodologies (Dave, May 31)

http://msevents.microsoft.com/cui/eventdetail.aspx?EventID=1032338974&culture=en-CA

Over the years the various approaches teams have used to develop software have evolved. Join Dave Remmer in the second of a series focused on aspiring architects where we will discuss the various stages projects go through and sample some of the methodologies used by teams developing software. In this session we will compare and contrast the waterfall, agile, RUP, Scrum and MSF methodologies and how they are used within software projects.

Services orientation and other architectural paradigms  (Dave, June 7)

http://msevents.microsoft.com/cui/eventdetail.aspx?EventID=1032338978&culture=en-CA

One of the hottest topics in software architecture is the services oriented approach to building solutions and how this can provide agility, flexibility and reuse. Join Dave Remmer in the third of a series focused on aspiring architects where we will be looking at approaches to architecting software. This session will give an overall description of service orientation and how it differs from object oriented and component based architectures as well as a discussion of some of the organizational challenges teams experience when using a services oriented architecture.

Transitioning from a developer to an architect  (Mohammad, June 14)

http://msevents.microsoft.com/cui/eventdetail.aspx?EventID=1032338980&culture=en-CA

Are you a developer who would like to learn more about becoming an architect? Or how to get formally recognized as one (since you already wear the design and architecture hat along with the developer one)? Join Mohammad Akif for the fourth and last part of the series focused on aspiring architects. During this session we will discuss how you can attain the skill set required to be an architect and sell yourself as an architect within your organization and industry. We will also provide a list of resources that you can use to continue the transition from a developer to an architect role.

A Big Strike Against TableAdapters

I'm hoping that someone reads this post and corrects me. But I'm not holding out much hope.

I don't normally use TableAdapters, but for a small application I decided that they seemed like a reasonable choice. And so long as I was using them on my development platform, all was well and good. The problem arose when I delivered the application.

Like any good developer, I store the connection string to the data store in the configuration file. Which is what I did for this application. When I defined the TableAdapters, I pointed the connection string property to the same setting. Or so I thought. But what actually happened was that the connection string was stored in the DLL that contained the adapters. Hard-coded. As in not modifiable through the config files. And, naturally, this wasn't discovered until deployment.

So I started to look for ways to have the TableAdapter pull the connection string from the config files. I figured that this would be a fairly common scenario, so it should be addressed. Not so much.

Apparently there is no way to automatically have TableAdapters use config settings. The "solution" is to use the fact that TableAdapters are partial classes to create a write-only ConnectionString property. It looks like the following:

public partial class FormTableAdapter
{
   public string ConnectionString
   {
      set
      {
         System.Data.SqlClient.SqlConnection conn =
            new System.Data.SqlClient.SqlConnection();
         conn.ConnectionString = value;
         this.Connection = conn;
      }
   }
}

Then, where the adapter is instantiated in your code, you set the ConnectionString property to the value from the config file.
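
For completeness, the call site ends up looking roughly like this (the adapter name is from the snippet above and the "MyDb" connection string name is a placeholder for this example):

// Assumes a <connectionStrings> entry named "MyDb" in the config file and a
// reference to System.Configuration
FormTableAdapter adapter = new FormTableAdapter();
adapter.ConnectionString =
   ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;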

To me, this is unacceptable. I was already a little skeptical of TableAdapters (I don't like the DataSet and the data access code in the same assembly), but this takes the cake. It almost seems like they were designed to not allow a reasonable deployment model. Maybe it will get better in the next version, but until then, I'm sticking with DataAdapters.

BizTalk Message Helper Methods

This entry describes two helper methods to interact with messages in a BizTalk orchestration.

The first helper methods return the string representation of a BizTalk message (or message part). The methods are as below:

/// <summary>
/// Pass in a BizTalk message and it will return a string
/// </summary>
/// <param name="btsMessage">The BizTalk message to get the string from</param>
/// <returns>The string from the Body of the BTS Message</returns>
public static string GetStringFromBTSMessageBody(XLANGMessage btsMessage)
{
  string result;
  StreamReader reader = new StreamReader((Stream)btsMessage[0].RetrieveAs(typeof(Stream)));
  result = reader.ReadToEnd();
  reader.Close();
  return result;
}

Or:

/// <summary>
/// Pass in a BizTalk message part and it will return a string
/// </summary>
/// <param name="btsPart">The BizTalk message part to get the string from</param>
/// <returns>The string from BTS Part</returns>
 public static string GetStringFromBTSMessagePart(XLANGPart btsPart)
 {
   string result;
   StreamReader reader = new StreamReader((Stream)btsPart.RetrieveAs(typeof(Stream)));
   result = reader.ReadToEnd();
   reader.Close();
   return result;
 }

Therefore in an orchestration expression shape, the following code will return the string representation of a BizTalk message using the helper method:

// Below will return the string from a BizTalk Orchestration message called msgBTS.
// strFromBTSMsg is a string variable declared in the orchestration
strFromBTSMsg = BTSClassHelper.MessageHelper.GetStringFromBTSMessageBody(msgBTS);

An alternative to using the above helper method is to create a System.Xml.XmlDocument variable in the orchestration,
then assign the BizTalk message to the variable. Then as below, the OuterXml can be extracted from the XmlDocument:

varDom = msgBTS;
strFromBTSMsg = varDom.OuterXml;

The downside to using the XmlDocument variable is that the whole message will be loaded into memory and an extra XmlDocument variable must be created in the orchestration.

The second helper method (CreateBTSMsgFromString) will construct a BizTalk message from a string. This method is a copy from this post, with a few minor modifications.
The referenced post describes how a binary message in an orchestration can be constructed programmatically.
Just as a side note, remember a BizTalk message can be:
a) XML
b) Anything else, i.e. a Word document, PDF, Excel spreadsheet, JPG, flat file, etc.

The helper classes can be downloaded at the end of this blog entry, so I will not repeat the code for the helper method CreateBTSMsgFromString.

Therefore in an orchestration expression shape, the following code will construct a BizTalk message from a string using the helper method:

// strFromBTSMsg is a string variable declared in the orchestration
// msgBTS is the BizTalk message to be constructed
// the last parameter is the encoding to apply to the message
BTSClassHelper.MessageHelper.CreateBTSMsgFromString(strFromBTSMsg,
                                                    msgBTS,
                                                    BTSClassHelper.MessageFactoryEncoding.UTF8);  
 

An alternative to using the above helper method is to create a System.Xml.XmlDocument variable in the orchestration.
Then, as below, load in the string and assign the XmlDocument to the BizTalk message:

strFromBTSMsg = "<Message><FirstName>Bob</FirstName></Message>";
varDom.LoadXml(strFromBTSMsg);
msgBTS = varDom;

The downside to the above approach compared to the helper method is the extra overhead of the System.Xml.XmlDocument variable.

You can download the helper classes and a quick example of how to use them Here (Zip File).
Read this before trying to run it.

 

Turning an Immutable Message in BizTalk into a Mutable message

One thing that you learn pretty fast in BizTalk is that messages in an orchestration are immutable/read only.

If you need to modify a message in a BizTalk orchestration, you are pretty well restricted to using a Construct shape with encapsulated Transform and/or Message Assignment shapes to create a modified version or a copy of the original message. Distinguished fields, xpath statements, BizTalk maps, custom .NET components, etc. can be used to modify the message.

Below is one simple technique that can be used to modify a message anywhere in an orchestration.
Helper class(es) are required, but in certain situations (explained below) this technique makes that easy to do.

Below is an orchestration where this technique is used:

 

Very simply, this orchestration subscribes to a PO XML message and then produces a
final Invoice XML message that is sent out from the orchestration.

Below are the artifacts used in the solution:

The artifacts for the BizTalk project, BTS_Immutable_To_Mutable include:

1) ClassOrchestration.odx (Orchestration as above)
2) Invoice.xsd (schema for outgoing Invoice XML message)
3) PO.xsd (schema for incoming PO XML message).
4) InvoiceClassGen.bat and POClassGen.bat

Below is the InvoiceClassGen.bat file:

The above uses the .Net xsd.exe utility to generate an Invoice.cs class from the Invoice.xsd schema.
This Invoice.cs class is used in the Class_Immutable_To_Mutable project as below. 
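
The screenshot of the batch file isn't reproduced here, but the xsd.exe call inside it would look roughly like the following (the namespace switch is an assumption):

rem InvoiceClassGen.bat (representative content)
xsd.exe Invoice.xsd /classes /language:CS /namespace:Class_Immutable_To_Mutable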

The artifacts for the Class Library project, Class_Immutable_To_Mutable include:

1) Helper.cs (helper class to populate some of the fields of the Invoice)
2) Invoice.cs (Invoice class for variable in the orchestration)
3) PO.cs (PO class for variable in the orchestration)

This orchestration will:

1) Accept a PO xml message

2) As below, in an expression shape, assign the BTS PO message to a BTS variable of type PO.cs:

// Set the BTS Variable PO to the incoming BTS Message PO
varPO = msgPO;

3) As below, in an expression shape, populate some of the Invoice fields from the PO fields:

// Populate some of the fields in the BTS Invoice Variable,
// from the BTS PO variable fields.
varInvoice.TotalAmt = varPO.Amt;
varInvoice.TotalCount = varPO.Qty;

4) As below, in an expression shape, call a helper class to populate and return the Invoice Items class:

varInvoice.Items = Class_Immutable_To_Mutable.Helper.mapPOItemsToInvoiceItems(varPO);

5) As below, in an expression shape, call a helper class to return and assign the description for the invoice Description field.

// Set the BTS Variable Description field
varInvoice.Description = Class_Immutable_To_Mutable.Helper.GetInvoiceDesc(varInvoice);

6) As below, in an expression shape, call a helper class to return and assign the version for the invoice Version field:

// Set the BTS Message Invoice Version field
varInvoice.Version = Class_Immutable_To_Mutable.Helper.GetInvoiceVersion();

7) Finally, at the end, in a Construct/Message Assignment shape, construct the outgoing BTS Invoice message:

// Create the BTS Invoice message from the Variable Invoice message
msgInvoice = varInvoice;

8) Send out the Final Invoice XML message
 
So after all of this, could a BizTalk map have been used to create the Invoice message from the PO message? The answer is yes or no, depending on the mapping logic that is needed.

This leads to when to use this method:
when the creation of a message requires multiple calls to helper components/Business Rules.

Some of the upsides to using this approach are:
1) Using the above technique takes away the restriction of the immutable message by letting you work with a mutable variable in the orchestration.
2) Intellisense is available on the variables inside of the orchestration.
3) The variable can be modified directly in an expression shape inside of the orchestration, without the use of distinguished fields or xpath statements. 

Some of the downsides to using this approach are:
1) The overhead of deserialization and serialization from message to variable and vice versa.
2) Creating and maintaining the Helper classes (in this case PO.cs and Invoice.cs)

You can download the above example HERE (Zip File).
Read the Readme before installing and running the example.

For a similar example, go to Here
and download the Using a Custom .NET Type for a Message in Orchestrations example.