BizTalk -> More on Constructing Messages and Configuration Information for an Orchestration

I recently was reading Scott's entry on Constructing Messages from Scratch with Embedded Resources. This is a good method, so take a look at it HERE.
Below is a variant on Constructing Messages in an Orchestration. This also can be used as a method to read any type of configuration information into an Orchestration.
This method needs:

1) A Sql Server table to store the templates for the XML Messages and/or Configuration Information.
2) A BizTalk XML Schema.
3) A BizTalk Map.

A Sql Server Table
Below is a create statement for the table with three columns:

if exists (select * from dbo.sysobjects where id = object_id(N'[GenericCodes]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [GenericCodes]
GO

CREATE TABLE [GenericCodes] (
 [Code] [varchar] (50) NOT NULL ,
 [Value] [varchar] (7000) NOT NULL ,
 [Description] [varchar] (500) NULL ,
 CONSTRAINT [PK_GenericCodes] PRIMARY KEY CLUSTERED ([Code]) -- Code is the primary key, as described below (constraint name illustrative)
)

Some sample data for the table is as below:

a) The Code column is the Primary Key of the table that describes the contents of the row.
b) The Value column contains the actual template XML message or Configuration value.
c) The Description column contains a description/purpose of the template XML message or configuration value.

In the sample data above, the first two rows contain template XML for messages that need to be constructed from scratch in an orchestration. A sample is as below:

<ns0:ShippingInfo ShipToAddress="Some Address" ShipToCity="Some City" ShipToCountry="Some Country" xmlns:ns0="http://ConstructingXMLMessages.ShippingInfo" />

The remaining rows contain configuration information for a process:

Request401_FIFOMIBSend -> How many Processor orchestrations can be running at one time (1 or many)
Request401_MaxNumberofFilesToBatch -> How many incoming messages to batch together using a looping sequential convoy
Request401_TestMode -> To indicate if the system is in test mode or production mode.
Request401_TimeoutForFileBatching -> The max time to wait for the next message to batch.
Response401_DB2Process401Results_CommandTimeout -> Passed to an ADO.NET Helper component to update a DB2 database.
Response401_DB2Process401Results_Retries -> Used to configure a Component Retry Pattern.
Response401_DB2Process401Results_RetryInterval -> Used to configure a Component Retry Pattern.
TwoDigitYearSupport -> To help translate two digit years to four digit years in a map.

A BizTalk XML Schema

Create a BizTalk Schema called XmlTemplateAndConfigInfo.xsd as below:

Each node (which can be an element or an attribute) is of Data Type string. Notice that each node's name matches a Code column value from a row in the Configuration table.
(Note: For this method to work, it is important to match the names. This will be explained below.)
Each node is promoted to a distinguished field.

A BizTalk Map

Create a BizTalk Map called Map_PopulateXMLTemplateAndConfigInfo.btm as below:

Notice that the source and destination schemas use the same schema (XmlTemplateAndConfigInfo.xsd) that was described above.

A Database Lookup functoid and a Value Extractor functoid are used to populate each node in the Destination Schema.

The Configuration information for a DataBase lookup functoid looks like below:

The first input parameter for a Database Lookup functoid is the lookup value. In this case it should be set to the value of a row from the Primary Key Code column in the configuration table GenericCodes. Instead of hard-coding this value (for example, Request401_TestMode), a link from the source schema is used. After the link has been created, go to the Properties window as below and set the Source Links property to -> Copy name. Then, instead of the value of the source node being mapped over, the name of the node is mapped over. This is why it is important for the name of each node to match a row's Code value in the GenericCodes table.

The second input parameter for a Database Lookup functoid is the Database Connection string. A UDL file was used to hold the connection string for the database, as described HERE.

A Value Extractor functoid is then used to extract the XML Template or Configuration Value and map it to the Destination node.

A Sample Orchestration to use the XML templates and Configuration Values

Sample Orchestration as below:

The following Messages and Variables are created in the Orchestration View for the Orchestration:

msgEmptyXmlTemplateAndConfigInfo MessageType is set to the XmlTemplateAndConfigInfo.xsd schema discussed above.
msgPopulatedXmlTemplateAndConfigInfo MessageType is set to the XmlTemplateAndConfigInfo.xsd schema discussed above.
msgCreateFromXMLTemplate is a test message that is constructed from an XML Template.

Both varXMLDomEmptyXmlTemplateAndConfigInfo and varXMLDomCreateFromXMLTemplate are variables of type -> System.Xml.XmlDocument
They are used to help in message construction.

1) In the First construct shape in the Orchestration -> Construct EmptyXmlTemplateConfigInfo
A Message Assignment shape is used to construct message -> msgEmptyXmlTemplateAndConfigInfo
The code in the Message Assignment is as below:

// First need an empty message to construct the Empty Config Info XML Message
// Note: The XML is just the Root Node Name and the target namespace of the schema XmlTemplateAndConfigInfo.xsd
// This is hardcoded, but should never change.
// This constructed message is needed in the next orchestration shape which uses a map to construct the msgPopulatedXmlTemplateAndConfigInfo 
// message.
varXMLDomEmptyXmlTemplateAndConfigInfo.LoadXml(@"<ns0:XmlTemplateAndConfigInfo xmlns:ns0=""http://Demo.XmlTemplateAndConfigInfo""/>");
msgEmptyXmlTemplateAndConfigInfo = varXMLDomEmptyXmlTemplateAndConfigInfo;

2) In the Second construct shape -> Construct PopulatedXmlTemplateConfigInfo
A Transform shape is used to construct message -> msgPopulatedXmlTemplateAndConfigInfo.

The map -> Map_PopulateXMLTemplateAndConfigInfo.btm is used in the transform shape.
Message msgEmptyXmlTemplateAndConfigInfo is used as the Transform Source Message
Message msgPopulatedXmlTemplateAndConfigInfo is used as the Transform Destination Message 

The map -> Map_PopulateXMLTemplateAndConfigInfo.btm will then be invoked to construct the
msgPopulatedXmlTemplateAndConfigInfo. The nodes in this message will be populated from the values in Sql
table GenericCodes.

3) The third construct shape is an example of constructing a message from an XML template that originates from the Sql table GenericCodes.
A Message assignment shape is used to construct message -> msgCreateFromXMLTemplate

// Load the Template XML from the msgPopulatedXmlTemplateAndConfigInfo.BaseXML_401ExtraInfoRequest node.
// Because Distinguished Fields are used, IntelliSense works as below
varXMLDomCreateFromXMLTemplate.LoadXml(msgPopulatedXmlTemplateAndConfigInfo.BaseXML_401ExtraInfoRequest);
msgCreateFromXMLTemplate = varXMLDomCreateFromXMLTemplate;

4) Below is an example of reading the msgPopulatedXmlTemplateAndConfigInfo for Configuration information.
The Decide shape (Decide If In Test Mode, Test Mode Rule) uses the following expression to determine if in test mode:

// Intellisense can be used because the Nodes in the XmlTemplateAndConfigInfo.xsd were promoted as
// Distinguished Fields.
msgPopulatedXmlTemplateAndConfigInfo.Request401_TestMode == "1"
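In the same way, a numeric configuration value can be read through its distinguished field and converted in an Expression shape. A minimal sketch, assuming a Delay shape is used for the batching timeout (the variable name tsBatchTimeout and this exact usage are illustrative, not from the original solution):

```csharp
// Convert the string config value into a System.TimeSpan, for example to
// drive a Delay shape in the batching Listen branch.
// Variable name is illustrative.
tsBatchTimeout = new System.TimeSpan(0, 0,
 System.Convert.ToInt32(msgPopulatedXmlTemplateAndConfigInfo.Request401_TimeoutForFileBatching));
```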


The good things about this method are:

1) No code. You do not have to write any .NET helper code to read the XML templates and Configuration Values into an Orchestration.
2) This is a simple method that I originally used at a client where .NET skills were scarce. Why burden them with .NET code to maintain and test when they do not have the resources to do so?
3) As soon as a value in the GenericCodes table changes, it is automatically visible.
4) A tool such as Query Analyzer or the Sql Enterprise Manager can be used to change the XML Template and Configuration values in the database.
5) This is a central repository of configuration data that any BizTalk Server in the Group will point to.

The not so good things about this method are:

1) For each node to be populated in the Map_PopulateXMLTemplateAndConfigInfo.btm map, a separate round trip to the database must be done. But we are talking milliseconds.

2) Each orchestration that needs the configuration values would have to add the necessary Construct shapes to invoke the population map. But it would be easy to create one orchestration to centralize this procedure and then let the other orchestrations call into this "Configuration Orchestration" using a Call Orchestration shape. The "Configuration Orchestration" would then return a msgPopulatedXmlTemplateAndConfigInfo message as an out parameter.

Just a Note

I always try to do a bit of research on my Blog topics so I can point the reader to other techniques and different views. So below are some links to other blogs on BizTalk and Configuration. Depending on your needs, take your pick:

1) The BizTalk Configuration Dilemma
2) Using the Rules Engine for orchestration configuration values *BizTalk Sample*...
3) .NET configuration files with Orchestration Hosts
4) How to store configuration data in the Enterprise Single Sign-On Database (SSO)
5) Using the BTSNTSvc.exe.config
6) Maybe BizTalk 2006 has some new features for configuration? Just installed my copy and have not had the time to look yet.

BizTalk 2004 -> Integrating with Sharepoint 2003 and when NOT to use the BizTalk WSS Adapter

The Sharepoint Send/Receive Adapter released in March 2004 on GotDotNet can be downloaded HERE.
It provides functionality to add (Send) items from BizTalk into Sharepoint Libraries and to pull (Receive) items that have been added into a Sharepoint Library and to submit them into the BizTalk environment. The Sharepoint Adapter is a great example of how a custom BizTalk 2004 Adapter can be built to allow BizTalk to communicate with any type of application.

Last year, in 2004, I was part of a team working on a Sharepoint 2003 Portal/BizTalk 2004 project. BizTalk 2004 was used to integrate an HR (Human Resources) system with Sharepoint WSS libraries and Sharepoint Portal User Profiles. In fact, BizTalk was used to aid in the replication of data from the HR system, in real time, to the Sharepoint Portal Server: any changes to the HR Employee Profiles (updates/inserts/deletes) were to be replicated in a timely manner to the Sharepoint Portal Server(s) WSS Lists and/or User Profiles.

More Specifically:

1) Employee profiles from the HR system were inserted, updated and deleted in Sharepoint Library Lists.
2) Employee Resumes (doc, pdf, rtf formats) from the HR system were inserted, updated and deleted in Sharepoint Document Library Lists.
3) Employee Photos (tiff, jpg formats) from the HR System were inserted, updated and deleted in Sharepoint Picture Library Lists.
4) Employee profiles from the HR system were used to create, update and delete Sharepoint User Profiles and to also generate MySites for each User Profile.
Note: User Profiles and MySites are features of Sharepoint Portal Server 2003 and not Windows Sharepoint Services (WSS). User Profiles can be used to store information about each portal user. Additionally, each portal user can have their own MySite, where they can add, modify and delete their own specific content and have control of their own Web Site within the portal.

Some of the specific requirements/functionality/infrastructure of the integration project:

1) There were two Sharepoint 2003 Portal Servers:
a) English Portal Server (The English Portal Server was a load balanced web farm implemented with two physical servers hosting the English Portal.)
b) French Portal Server

a) The English portal hosted the English User Profiles and English MySites.
On the English Portal there were two main WSS sub sites -> an English Sub Site and a French Sub Site.
Note: there was some discussion of moving the French Sub Site to the physical French Portal server, but at the time of development the English and French Sub Sites were on the English Portal. Therefore the Employee Profile, Resume and Picture libraries were duplicated on the English and French Sub Sites of the English portal.
Note: We tried to have one main list with both the English and French columns at the portal level and then create views of this main list down on the English and French Sub Sites. This did not work. Therefore duplicate libraries, one on each sub site, were used as below:

English Employee List:

French Employee List:

b) The French Portal Server hosted the French User Profiles and MySites

3) Any change made about an employee on the HR System (insert/update/delete), was to be replicated across to the Sharepoint Employee Profile List.

4) Any change to an Employee Picture or Employee Resume on the HR System (insert/update/delete) was to be replicated across to the Sharepoint Employee Picture Libraries and Resume Libraries. A Document or Picture Library can be thought of as a regular Sharepoint list, except that a File is also associated with the list.

Additionally, the HR system could indicate whether only the actual Resume or Picture file changed, or only the text information about the Employee changed (i.e., their name or the city they worked in). This could be used to aid performance: if only the person's name changed, then do not bother updating the actual Resume or Picture binary and only update the Sharepoint List text columns (i.e. First Name, Last Name, City).

Below is the Detail View of the English Picture Library:

Below is the ThumbNail View of the English Picture Library:

5) When a File is added to a Document library, an actual URL (for example: http://spserver/sites/EnglishSite/EmployeePictures/Bob.jpg) 
is generated that can be retrieved from a Sharepoint API method call as below:

url = spListItem.File.ServerRelativeUrl;

This URL was sent back to the calling BizTalk Orchestration, so that it could ultimately update a Sharepoint Portal Server User Profile URL property. For example, one of the default properties for User Profiles is the picture URL. When the user's MySite is navigated to, the picture URL is used to automatically display the picture on the user's MySite as below:

Below is the edit page for Bob the Builder's User Profile:

Below is the MySite for Bob The Builder.
Note that the User Profile for Bob the Builder is used to populate the MySite.

6) If the particular Sharepoint Library did not exist, then automatically create it.
For example if BizTalk submitted an Employee Profile(s) to Sharepoint for processing and the particular library list did not exist, then: 
a) Automatically create the list.
b) Populate the list from the submitted HR profile(s).

7) Any change made about an employee on the HR System (insert/update/delete), was to be replicated across to the Sharepoint User Profile Database.

8) A possible future enhancement was to have the information about Resumes also stored in Sharepoint Portal Areas. In this implementation Areas were used to represent the taxonomy of the company.

From the above functionality and requirements it became clear that it was going to be difficult or impossible to use the existing WSS adapter directly. Much finer grained control of the Sharepoint functionality was needed in order to implement the system.
Therefore there were two choices:

Choice 1) Extend the WSS BizTalk Adapter to handle the additional functionality.
Choice 2) Let BizTalk call custom web methods hosted by web services installed on the Sharepoint Portal Servers. The custom web methods would receive as parameters information about employees in an XML format, and additionally binary employee resumes and pictures. Custom code in the web methods would use the Sharepoint API to insert/update/delete items in the Sharepoint Lists and the Sharepoint User Profiles. The web methods could also return results, such as the URLs of the pictures or resumes to set into Sharepoint Portal User Profile properties.

In the end choice 2) was chosen to implement the solution.
The particular customer where I was doing the work had a number of strong .NET/Sharepoint/Web Services developers, so it ultimately came down to the lead developer's choice, and he preferred choice number 2).

Choice 1) is also a viable option, but in retrospect I am thankful that the custom web services method was chosen. Each Sharepoint List had its own little quirks, and the functionality to populate them varied from list to list. In the end it was much easier to have a separate set of web methods to control the population of each Sharepoint list and the Portal User Profiles. To make the code more modular, a common set of Sharepoint helper methods was used, so that code did not have to be duplicated from web method to web method. If there were more Sharepoint Lists to be populated and the number of lists grew over time, then using an adapter would become more attractive. One of the things you lose by writing custom web methods is that each List is tightly bound to a web method, so if the number of Sharepoint lists grows, the number of web methods also grows.

Below is a more detailed discussion of how the custom code was implemented, including some detail on using the Sharepoint API. Additionally, at the end of this blog you can download the code that populates the Sharepoint Employee List.

A simplified flow of information from the HR System to Sharepoint went something like this:

1) From the HR System, information about the employees was delivered into BizTalk in an XML format.

2) A BizTalk Orchestration would then subscribe to the incoming Employee message.

A simplified Employee XML message is as below:

<ns0:EmployeeProfiles xmlns:ns0="http://BobTheBuilder">
    <LAST_NAME>The Builder</LAST_NAME>
    <LAST_NAME>The Contractor</LAST_NAME>
</ns0:EmployeeProfiles>

The XML was converted to a string in the BizTalk Orchestration Expression shape. An example is as below.

varXMLDomEmployee = msgEmployee;
// Now the following can be used to extract the XML as a string
strEmployeeXML = varXMLDomEmployee.OuterXml;
More details about the above can be found HERE

If the Employee information was associated with a Resume or a Picture, then in the Orchestration a .NET helper component using ADO.NET was called to retrieve the file from Sql Server. The code would return the Resume or Picture binary as a string, from a Sql Server text column. An excerpt of the code is as below:

string result;
System.Byte[] resumeImageBytes;
resumeImageBytes =  (System.Byte[]) this.sqlCmdGetResumeImage.ExecuteScalar();
result = System.Convert.ToBase64String(resumeImageBytes);
return result; 
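The receiving web method reverses this conversion with Convert.FromBase64String. A minimal, self-contained sketch of the round trip (the sample bytes are illustrative, standing in for the file read from the Sql Server column):

```csharp
using System;

class Base64RoundTrip
{
    static void Main()
    {
        // Simulated bytes of a resume file as read from the Sql Server column
        byte[] resumeImageBytes = { 0x25, 0x50, 0x44, 0x46 }; // "%PDF" header bytes
        // Encode to a string so the binary can travel as a web method parameter
        string encoded = Convert.ToBase64String(resumeImageBytes);
        // On the Sharepoint side, decode back to the original bytes
        byte[] decoded = Convert.FromBase64String(encoded);
        Console.WriteLine(encoded);                                   // JVBERg==
        Console.WriteLine(decoded.Length == resumeImageBytes.Length); // True
    }
}
```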

3) The string of Employee XML and optionally the Base64String of the Employee Resume or Picture was then sent to the Sharepoint Portal Servers from the BizTalk Orchestration via a Web Port to a Web Service installed on the Sharepoint Portal Servers. More detail about this can be found HERE

4) The particular web method on the portal Server would then accept as parameters the string of XML and optionally the binary Resume or Picture (as a string).

5) On the Web Server side, strongly typed datasets were used to hold configuration information for the Sharepoint Lists/Portal User Profiles and also included mapping information between the incoming Employee XML and the Sharepoint List Columns/ User Profile Properties. The dataset is as below.

XML installed on the Web Service side would then populate the dataset. A sample of the XML to populate the dataset is as below:

<?xml version="1.0" standalone="yes" ?>
<SharePointListData xmlns="">
    <SharePointListDescription>Employee Profiles List</SharePointListDescription>
    <SharePointListTitle>Employee Directory</SharePointListTitle>
    <TemplateName>Custom List</TemplateName>
    <SharePointColumnName>Last Name</SharePointColumnName>
    <SharePointColumnName>First Name</SharePointColumnName>
    <SharePointListDescription>Les Profils d'employé Enumèrent</SharePointListDescription>
    <SharePointListTitle>Annuaire d'employé</SharePointListTitle>
  <TemplateName>Custom List</TemplateName>
    <SharePointColumnName>Nom de famille</SharePointColumnName>

The above XML contains all the information necessary to create and/or get a handle to the necessary Sharepoint list(s) and contains the mappings between the incoming XML and the Sharepoint Columns. For example: The LAST_NAME node in the incoming XML would map to the Last Name column in the English Sharepoint list and would also map to the Nom de famille column in the French Sharepoint list. This configuration XML was then stored in the Web Services Cache. When the XML file changed, the File Dependency Cache would automatically load in the new XML file. In this way, the configuration information about the Sharepoint Lists and mappings could be changed on the fly.
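As a rough sketch of how that configuration XML could be loaded (the class and file names are assumed; the actual solution used a typed dataset generated in Visual Studio):

```csharp
// Hypothetical sketch: SharePointListData is the typed dataset described above.
// ReadXml maps the XML elements onto the dataset's parent/child tables.
SharePointListData listConfig = new SharePointListData();
listConfig.ReadXml(System.Web.HttpContext.Current.Server.MapPath("SharePointListData.xml"));
```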

Below is Sample Web Dependency Cache loading code. 

// Save to the Web Cache. Put a dependency on the file, so when the file is changed,
// the old copy will get flushed from the Cache.
// (The cache key and dataset variable names below are illustrative.)
System.Web.HttpContext.Current.Cache.Insert(fileName, listConfigDataSet,
 new System.Web.Caching.CacheDependency(System.Web.HttpContext.Current.Server.MapPath(fileName)));

6) The Web Method would then call a helper class that would actually do the processing against the Sharepoint List(s)/User Profile Database with the Employee XML and the Configuration Dataset.

a) The Helper class would first determine if the required Sharepoint List(s) existed. The configuration dataset contained the necessary information to create the Sharepoint List(s) if they did not exist, otherwise just get a handle to the existing list. The code to create the list looks something like the below:

try
{
  // If the below WSS list exists, then get it.
  employeeList = this.siteCollection[0].AllWebs[sharepointListRow.WebLocation].Lists[sharepointListRow.SharePointListTitle];
}
catch (System.ArgumentException ex)
{
  // Means the WSS list did not exist, therefore create it.
  employeeList = null;
  System.Diagnostics.Debug.WriteLine("The List does not exist. error is as follows -> " + ex.ToString());
}
if (employeeList == null)
{
  // Create the WSS List
  string listTitle = "", listDescription = "";
  listTitle = sharepointListRow.SharePointListTitle;
  if (!sharepointListRow.IsSharePointListDescriptionNull())
    listDescription = sharepointListRow.SharePointListDescription;
  else
    listDescription = sharepointListRow.SharePointListTitle;

  // This will add the new list to the List Collection of the Site.
  siteCollection[0].AllWebs[sharepointListRow.WebLocation].Lists.Add(listTitle,  // Title of New List
                  listDescription,       // Description of the List
                  Microsoft.SharePoint.SPListTemplateType.GenericList); // Type of list to create

  // This will get a handle to the just added list
  employeeList = siteCollection[0].AllWebs[sharepointListRow.WebLocation].Lists[listTitle];
  // Now add the necessary columns to the newly created list.
  // Also add the columns to the Default View of the List.
  // Note: This code only handles text type columns.
  Microsoft.SharePoint.SPView defaultView = employeeList.DefaultView;
  Microsoft.SharePoint.SPViewFieldCollection spViewFieldsCollect = defaultView.ViewFields;
  foreach (SharePointListData.SharePointListColumnsRow spColumn in sharepointListRow.GetSharePointListColumnsRows())
  {
    employeeList.Fields.Add(spColumn.SharePointColumnName, Microsoft.SharePoint.SPFieldType.Text, false);
    spViewFieldsCollect.Add(spColumn.SharePointColumnName);
  }
  defaultView.Update();
}

b) The Helper class would then take the Configuration dataset and use it to map the Employee XML into the correct Sharepoint List columns. An indicator is also passed in with the employee XML. If it is set to -1, then the Employee in the Sharepoint list must be deleted, otherwise the Employee must be added to the list if it is not found or updated if it is found. A column in the Sharepoint list is used as a primary key for CAML Queries.

public void setDataInEmployeeSPList(string employeeXml, Microsoft.SharePoint.SPList employeeList, SharePointListData.SharePointListRow sharepointListRow)
{
 System.Xml.XmlElement employeeChildNode;
 System.Xml.XmlDocument employee = new System.Xml.XmlDocument();
 System.Xml.XmlNode xmlNode;
 employee.LoadXml(employeeXml);

 xmlNode = employee.DocumentElement["PROCESSLISTINDICATOR"];
 string deleteListIndicator = xmlNode.InnerText;

 xmlNode = employee.DocumentElement[sharepointListRow.XMLSourcePKColumn];
 string employeeID = xmlNode.InnerText;

 Microsoft.SharePoint.SPListItem spListItemEmployee;
 Microsoft.SharePoint.SPListItemCollection spListItemsEmployees;
 // Call method that does a CAML query to try to find the Employee in the list.
 // If found, we will either update or delete it in the list.
 // (The dataset property holding the Sharepoint PK column name is illustrative.)
 spListItemsEmployees = this.QueryListForItems(employeeList, sharepointListRow.SharePointPKColumn, employeeID);
 if (deleteListIndicator == "-1")
 {
  if (spListItemsEmployees.Count == 1)
  {
   // Means must delete the Employee from the list
   spListItemEmployee = spListItemsEmployees[0];
   spListItemEmployee.Delete();
  }
  else
  {
   // Means could not find the item in the list to delete
   System.Diagnostics.Debug.WriteLine("Could not find the Employee to delete in the library list.");
  }
 }
 else // Either the Employee must be added or updated in the Employee list
 {
  spListItemEmployee = null;
  switch (spListItemsEmployees.Count)
  {
   case 1:  // Found an existing Employee -> Update
    System.Diagnostics.Debug.WriteLine("Found in library");
    spListItemEmployee = spListItemsEmployees[0];
    break;
   case 0: // Employee does not exist. Create a new employee item in the list
    System.Diagnostics.Debug.WriteLine("Did not find in library");
    spListItemEmployee = employeeList.Items.Add();
    break;
  }
  // Now set the information from the Employee XML to the Employee Sharepoint List Columns
  foreach (SharePointListData.SharePointListColumnsRow employeeColumnsRow in sharepointListRow.GetSharePointListColumnsRows())
  {
   employeeChildNode = employee.DocumentElement[employeeColumnsRow.XMLSourceColumnToMap];
   if (employeeChildNode != null)
    spListItemEmployee[employeeColumnsRow.SharePointColumnName] = employeeChildNode.InnerText;
  }
  if (spListItemEmployee != null)
   spListItemEmployee.Update();
 }
}

/// <summary>
/// Will query a list in Sharepoint. Only one column in the list is queried. For example City = "Toronto"
/// </summary>
/// <param name="listToQuery">The SPList to query</param>
/// <param name="columnToQuery">The name of the column to query in the SPList</param>
/// <param name="queryValue">The value to query for in the column</param>
/// <returns>A collection of List items that were returned by the query</returns>
public Microsoft.SharePoint.SPListItemCollection QueryListForItems(Microsoft.SharePoint.SPList listToQuery,
 string columnToQuery,
 string queryValue)
{
 Microsoft.SharePoint.SPListItemCollection result;
 Microsoft.SharePoint.SPQuery spQuery = new Microsoft.SharePoint.SPQuery();
 spQuery.Query = "<Where><Eq><FieldRef Name='" + columnToQuery + "'/><Value Type='Text'>" + queryValue + "</Value></Eq></Where>";
 result = listToQuery.GetItems(spQuery);
 return result;
}
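A call to this helper might look like the following (the column name and key value here are hypothetical):

```csharp
// Find the employee row whose primary-key column matches the incoming ID.
// "Employee ID" is an illustrative column name, not from the original solution.
Microsoft.SharePoint.SPListItemCollection matches =
    this.QueryListForItems(employeeList, "Employee ID", employeeID);
```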


c) In the end, there was not too much code to write, somewhere between 50 and 100 lines of code to get the job done.

Some Notes about the Sharepoint APIs

1) The Sharepoint APIs give the maximum amount of control over a WSS or Portal Site.
2) In order to use the WSS API you must add a reference to Microsoft.SharePoint.dll. The Sharepoint dll contains methods to interact with Sharepoint lists, Document/Picture libraries and much more.
3) In order to use the Sharepoint Portal API you must add a reference to Microsoft.SharePoint.Portal.dll. The Portal dll contains methods to interact with a portal's User Profile database, areas and much more.
4) The Sharepoint APIs are organized into a series of collections. For example:
a) A Portal contains a collection of WSS sites
b) A WSS site contains a collection of Sharepoint Lists
c) A Sharepoint List contains a collection of Sharepoint columns and Items that contain the data in the List
5) The APIs are fairly intuitive, but like everything have their own set of nuances.
For example, when adding a new item into a Generic list, it looks something like the below:

// Just add the new item to the Items collection of the list
spListItemEmployee = employeeList.Items.Add();

But when adding a new item into a Document Library or Picture list, it looks something like the below:

// Must add the new item to the lists Folder Files collection 
file = resumeList.SharePointFolder.Files.Add(xmlNode.InnerText /* Note.doc comes from PS+ ".doc"*/,Convert.FromBase64String(resumeByteStream)); 

BizTalk 2006

When BizTalk 2006 ships, it will include a built-in, out-of-the-box WSS adapter. I am sure there will be improvements over the existing BizTalk 2004 WSS adapter that can be downloaded from GotDotNet. Read more about the new out-of-the-box WSS adapter and other BizTalk 2006 adapter additions and enhancements HERE

Download the sample code that populates a Sharepoint Employee List HERE. Read the Readme.txt before installing and running.
Note: You must have Sharepoint WSS installed and Visual Studio 2003 on a Windows Server 2003 machine in order for the sample to work.





BizTalk 2004 -> Enterprise Integration Patterns and BizTalk

Why use Patterns?

1) They help in the design of an Integration Project.
Just like any other type of project, you need a good design in order to successfully develop and implement your solution.
For example:
If you are part of a team building a new ASP.NET application, most likely there will be some design and architecture development for a framework that will support the new ASP.NET application. The same can be said for an integration project, but on a different level.

An individual pattern can be described by one shape, as below.

With an integration project, the goal is to chain the patterns together as below:

The above diagrams were created in Visio. The Visio template that was used to create them can be downloaded HERE.

The above diagram is NOT a BizTalk specific solution, but really is an Integration (flow) diagram that describes the flow of messages through various system(s) or applications(s).

How the integration solution is implemented, is really up to the technology or technologies that are available
to the developers. It could be implemented using one or more of the following technologies:

a) .NET
b) Java
c) a database such as Sql Server could be used to help in the implementation
d) MSMQ could be used to help reliably deliver the messages from application to application
e) BizTalk could be used to help carry out the integration
f) many more tools are available.

2) Patterns create well-designed integration solutions that produce implementations which can be easily modified and maintained in the future.

3) Patterns can help with testing.

4) Patterns are simple (for the most part).

a) Splitter (Splits Messages)
b) Aggregator (Aggregates Messages)
c) Content Enricher (Will add more information or missing information to messages)
d) Normalizer (Normalizes messages into one common format)
e) There are approximately another 60 Patterns.

The name of the pattern often describes what the pattern does. But there is much to consider in an individual pattern.
For example:
In an Aggregator pattern, what happens if there are ten expected messages to be aggregated, but only nine messages show up?
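One common answer is to pair the expected count with a timeout and let the caller decide what to do with a partial batch. A language-neutral sketch of that loop logic (not BizTalk-specific; all names here are illustrative):

```csharp
using System;
using System.Collections.Generic;

class AggregatorSketch
{
    // Collect messages until the expected count arrives or the deadline passes.
    // Returns whatever was gathered; the caller decides whether a partial
    // batch is an error or is forwarded with a warning.
    public static List<string> Aggregate(IEnumerator<string> incoming, int expectedCount, DateTime deadline)
    {
        List<string> batch = new List<string>();
        while (batch.Count < expectedCount && DateTime.Now < deadline)
        {
            if (!incoming.MoveNext())
                break; // source exhausted before the count was reached
            batch.Add(incoming.Current);
        }
        return batch;
    }

    static void Main()
    {
        // Ten messages expected, only nine arrive
        List<string> nine = new List<string>();
        for (int i = 1; i <= 9; i++) nine.Add("msg" + i);
        List<string> result = Aggregate(nine.GetEnumerator(), 10, DateTime.Now.AddSeconds(1));
        Console.WriteLine(result.Count); // 9 - the caller must handle the shortfall
    }
}
```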

Patterns and BizTalk

BizTalk can be used to implement patterns.
For example: If you have worked with BizTalk, you can quickly think of a couple of ways to easily split a message, such as using an Envelope, or splitting the message in an orchestration.
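Outside of an Envelope, the orchestration-based split boils down to copying each repeating child node into its own document. A minimal, self-contained sketch (the message structure is hypothetical):

```csharp
using System;
using System.Xml;

class SplitterSketch
{
    static void Main()
    {
        // A hypothetical batch message with repeating child records
        XmlDocument batch = new XmlDocument();
        batch.LoadXml("<Orders><Order id=\"1\" /><Order id=\"2\" /></Orders>");

        // Split: each child element becomes its own single-record document
        foreach (XmlNode order in batch.DocumentElement.ChildNodes)
        {
            XmlDocument single = new XmlDocument();
            single.AppendChild(single.ImportNode(order, true));
            Console.WriteLine(single.OuterXml);
        }
    }
}
```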

The following features in BizTalk aid in implementing patterns (Note: this is not a complete list):
a) The publish and subscribe model that BizTalk implements.
b) Physical Receive and Send Ports.
c) The correlation features in BizTalk (Correlation Sets, Property Schemas etc.).
d) Mapping in BizTalk.
e) The Delivery Notification and Ordered Delivery properties in logical ports.
f) Role Links and Parties.
g) Many more features.

Not all Patterns are a snap to build in BizTalk, but
BizTalk does provide a great set of features to implement many of the patterns.

Where to find BizTalk examples of Patterns:

There are many places to find examples of patterns in BizTalk.
Below are just a few:

The Bloggers Guide to BizTalk has a whole section devoted to Patterns in BizTalk.


My last three Blog entries have discussed some examples of patterns



BizTalk 2004 -> Aggregator Pattern Using a Map and Orchestration. Also a Content Based Router Pattern

Below is a method to aggregate many incoming messages into one message using a Map inside of a BizTalk Orchestration.

This Aggregator pattern is the third demo of a webcast I did on March 28 2005 called: BizTalk Server 2004 Implementing Enterprise Integration Patterns. It can be viewed HERE as a recorded Webcast. The full BizTalk solution can be downloaded at the end of this blog.

In this particular scenario, the messages split from the Splitter orchestration have been processed by a Production Order application. These processed messages
now have to be aggregated back into one complete message.

The three abbreviated incoming split messages to be aggregated look something like:

<Rolls UniqueOrderID="10049" IdMill="Mill_One" OrderCount="3" UniqueIdentifier="7e96e060-3685-48bb-8113-9ae9957d8c4b">
  <trk_unit_roll_ageable trk_unit_id="10003" pro_product_id="10031" sch_prod_order_id="10049"/>
  <trk_unit_roll_ageable trk_unit_id="10006" pro_product_id="10022" sch_prod_order_id="10049"/>
</Rolls>

<Rolls UniqueOrderID="10043" IdMill="Mill_One" OrderCount="3" UniqueIdentifier="7e96e060-3685-48bb-8113-9ae9957d8c4b">
  <trk_unit_roll_ageable trk_unit_id="10004" pro_product_id="10024" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10005" pro_product_id="10022" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10008" pro_product_id="10022" sch_prod_order_id="10043"/>
</Rolls>

<Rolls UniqueOrderID="10048" IdMill="Mill_One" OrderCount="3" UniqueIdentifier="7e96e060-3685-48bb-8113-9ae9957d8c4b">
   <trk_unit_roll_ageable trk_unit_id="10007" pro_product_id="10022" sch_prod_order_id="10048"/>
</Rolls>

The final aggregated message to construct looks something like:

<Rolls IdMill="Mill_One">
  <trk_unit_roll_ageable trk_unit_id="10003" pro_product_id="10031" sch_prod_order_id="10049"/>
  <trk_unit_roll_ageable trk_unit_id="10006" pro_product_id="10022" sch_prod_order_id="10049"/>
  <trk_unit_roll_ageable trk_unit_id="10004" pro_product_id="10024" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10005" pro_product_id="10022" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10008" pro_product_id="10022" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10007" pro_product_id="10022" sch_prod_order_id="10048"/>
</Rolls>

One solution to accomplish the above is discussed below :

1) Create an Orchestration to accept the incoming messages to be aggregated.

In the Orchestration:

a) The first Receive Shape initializes a Correlation Set to aid in the aggregation.
A looping sequential convoy pattern is used in the Orchestration to receive the incoming messages. The first message received by the orchestration is interrogated to determine the expected number of messages to receive. This attribute can be seen on the message, such as OrderCount="3". Therefore, in this case, loop three times in the orchestration, because three messages are expected.

Another attribute on the first incoming message is also interrogated, such as UniqueIdentifier="7e96e060-3685-48bb-8113-9ae9957d8c4b". This value is used to correlate the three incoming messages into the correct running instance of the same orchestration. After the first message is received, only messages with UniqueIdentifier="7e96e060-3685-48bb-8113-9ae9957d8c4b" will be accepted. If another message comes in with a different identifier, such as UniqueIdentifier="D9E9DF5E-0071-4ddd-962D-FD4478E4C6FE", then another instance of the Aggregator Orchestration will be created that accepts only messages with that identifier.

Note: The attributes OrderCount and UniqueIdentifier were added to the split messages by the Splitter Orchestration.
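The correlation behaviour described above can be sketched in plain Python (this is not BizTalk API code, just an illustration of the logic): group incoming split messages by their UniqueIdentifier so each batch reaches the right orchestration instance, and treat an instance as complete once OrderCount messages have arrived.

```python
def correlate(messages):
    """Group messages by UniqueIdentifier; return only the complete batches."""
    instances = {}
    for msg in messages:
        # each distinct UniqueIdentifier corresponds to one orchestration instance
        instances.setdefault(msg["UniqueIdentifier"], []).append(msg)
    # an instance is complete once it holds OrderCount messages
    return {
        cid: batch
        for cid, batch in instances.items()
        if len(batch) == batch[0]["OrderCount"]
    }

incoming = [
    {"UniqueIdentifier": "7e96e060", "OrderCount": 3, "UniqueOrderID": "10049"},
    {"UniqueIdentifier": "D9E9DF5E", "OrderCount": 2, "UniqueOrderID": "20001"},
    {"UniqueIdentifier": "7e96e060", "OrderCount": 3, "UniqueOrderID": "10043"},
    {"UniqueIdentifier": "7e96e060", "OrderCount": 3, "UniqueOrderID": "10048"},
]
complete = correlate(incoming)
# only the 7e96e060 instance has received all three expected messages
```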

b) Three messages are declared in the Orchestration:

i)   msgInternalRollsAll (this message will be the final aggregated message sent out)
ii)  msgInternalRollsAllTemp (this message is of the same type as above, but is used to help in processing, discussed below)
iii) msgInternalRollsOneOrderID (this message represents each of the separate incoming messages to be aggregated)

c) In the orchestration, create a valid instance of the msgInternalRollsAll message. This is so the mapping (discussed below) will execute correctly the first time around the loop.
This is done inside an expression shape:

varXMLDomForMsgInternalRolls.LoadXml(@"<ns0:Rolls IdMill=""IdMill_0"" xmlns:ns0=""http://RollsInternal"" />") ;
construct msgInternalRollsAll
   msgInternalRollsAll = varXMLDomForMsgInternalRolls;

d) Use a Loop shape. The number of iterations for the loop is controlled by the attribute of the first incoming
message, such as OrderCount="3".

e) In the loop, assign the msgInternalRollsAllTemp message from the msgInternalRollsAll message.
This is done so we can keep appending to the final output message as each loop iteration executes.
This is accomplished inside an expression shape of the orchestration:

construct msgInternalRollsAllTemp
   msgInternalRollsAllTemp = msgInternalRollsAll;

f) In the loop, invoke a map that keeps appending to the message msgInternalRollsAll:

The destination message for the map is msgInternalRollsAll. A map with two inputs (two XML messages) is used to create the output message.
The first source for the map is the msgInternalRollsOneOrderID message, which contains the contents of each incoming split message to be appended to the final outgoing message (msgInternalRollsAll). The second source for the map is the msgInternalRollsAllTemp message. This is an interim message used to keep a copy of the final output message (msgInternalRollsAll) for each iteration of the loop. The map then takes the contents of the two messages (nodes) and combines them into one node.
A Looping functoid is used to combine the contents of the two source nodes into the final destination node.
A more detailed explanation of this method can be found HERE.
g) Use a Decide shape to determine if more messages are expected.
If more messages are expected, use a Receive shape to receive the next incoming split message, then go to the top of the loop to append it to the final output message. If no more messages are expected, the final output message (msgInternalRollsAll) is sent to the
correct destination using a Content Based Router pattern, as below.
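The temp-copy loop from steps e) and f) can be sketched in plain Python (an illustration of the logic only, not BizTalk code). Here merge() stands in for the two-input map: it combines the rolls already aggregated (the temp copy) with the rolls from the next split message.

```python
def merge(temp_rolls, next_rolls):
    # stands in for the two-input map with the Looping functoid
    return temp_rolls + next_rolls

split_messages = [
    ["10003", "10006"],           # rolls for order 10049
    ["10004", "10005", "10008"],  # rolls for order 10043
    ["10007"],                    # rolls for order 10048
]

msg_all = []                      # msgInternalRollsAll, seeded with a valid empty instance
for msg_one_order in split_messages:   # loop OrderCount times
    msg_temp = list(msg_all)      # e) msgInternalRollsAllTemp = msgInternalRollsAll
    msg_all = merge(msg_temp, msg_one_order)  # f) the map appends to msgInternalRollsAll
```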

Content Based Router Pattern

Initially a message came into the system from a particular party. In the demo, messages can be received from three different parties: Mill One, Mill Two, and Mill Three.
These messages were then normalized, split, and aggregated, and now the final message must be routed back to the Mill that originally sent it. Therefore the outgoing message contains some information about where it is going; in this case there is an attribute in the outgoing message such as
IdMill="Mill_One". This attribute value is used to dynamically route the message to the correct party. To implement this in a BizTalk solution, Role Links in an orchestration and Parties created in BizTalk Explorer were used. In the orchestration, the following code in an expression shape routes the
message to the correct party:

varStrMillToSendTo = msgInternalRollsAll.IdMill;
RoleLinkToCorrectMill(Microsoft.XLANGs.BaseTypes.DestinationParty) = new Microsoft.XLANGs.BaseTypes.Party(varStrMillToSendTo, "OrganizationName");

If you have not used Role Links and Parties before, go to HERE and HERE for a more detailed explanation.
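The Content Based Router decision itself is just a lookup keyed on the IdMill attribute. A plain-Python sketch of that decision (the endpoint values here are made-up placeholders, not real BizTalk party configuration):

```python
# hypothetical party/endpoint table; in BizTalk this lives in the Parties
# configured in BizTalk Explorer, not in code
PARTIES = {
    "Mill_One":   "endpoint-for-mill-one",
    "Mill_Two":   "endpoint-for-mill-two",
    "Mill_Three": "endpoint-for-mill-three",
}

def route(message):
    """Return the destination for the message's IdMill, or raise if unknown."""
    mill = message["IdMill"]
    if mill not in PARTIES:
        raise KeyError("no party configured for " + mill)
    return PARTIES[mill]

destination = route({"IdMill": "Mill_One"})
```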

Again the full sample can be downloaded HERE . Read the ReadMe.txt file in the zip file, before unzipping and installing.

More Aggregator Patterns

Note: There are multiple ways to implement an Aggregator pattern in BizTalk.
Another method includes using XmlDocument .NET objects to build up the message in the orchestration.
Please see HERE and HERE for other examples.




BizTalk 2004 -> Splitter Pattern Using a Map and Orchestration.

Below is a method to Split one message into multiple messages using a Map inside of a BizTalk Orchestration.

This Splitter pattern is the second demo of a webcast I did on March 28 2005
called: BizTalk Server 2004 Implementing Enterprise Integration Patterns. It can be viewed HERE
as a recorded Webcast. The full BizTalk solution can be downloaded at the end of this blog.

In this particular scenario, an Incoming message contains several production orders.
This message was created from a Normalizer(this is a pattern) Orchestration. A discussion
of the Normalizer Pattern can be found HERE

The abbreviated incoming Normalized XML message looks something like the below:

  <trk_unit_roll_ageable trk_unit_id="10003" pro_product_id="10031" sch_prod_order_id="10049"/>
  <trk_unit_roll_ageable trk_unit_id="10004" pro_product_id="10024" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10005" pro_product_id="10022" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10006" pro_product_id="10022" sch_prod_order_id="10049"/>
  <trk_unit_roll_ageable trk_unit_id="10007" pro_product_id="10022" sch_prod_order_id="10048"/>  
  <trk_unit_roll_ageable trk_unit_id="10008" pro_product_id="10022" sch_prod_order_id="10043"/>  

From the above message there are three distinct Order Ids: 10049, 10043 and 10048.

Therefore the goal is to split the incoming Normalized message into three separate messages,
each with its own unique Order Id and respective line items, as below:

Message one:

<Rolls UniqueOrderID="10049">
    <trk_unit_roll_ageable trk_unit_id="10003" pro_product_id="10031" sch_prod_order_id="10049"/>
    <trk_unit_roll_ageable trk_unit_id="10006" pro_product_id="10022" sch_prod_order_id="10049"/>
</Rolls>

Message two:

<Rolls UniqueOrderID="10043">
    <trk_unit_roll_ageable trk_unit_id="10004" pro_product_id="10024" sch_prod_order_id="10043"/>
    <trk_unit_roll_ageable trk_unit_id="10005" pro_product_id="10022" sch_prod_order_id="10043"/>
    <trk_unit_roll_ageable trk_unit_id="10008" pro_product_id="10022" sch_prod_order_id="10043"/>
</Rolls>

Message three:

<Rolls UniqueOrderID="10048">
   <trk_unit_roll_ageable trk_unit_id="10007" pro_product_id="10022" sch_prod_order_id="10048"/>
</Rolls>

One solution to accomplish the above is discussed below :

1) Create an Orchestration to accept the incoming message and split it out into
separate messages:

In the Orchestration:

a) Find the distinct list of Order Ids in the message. One method to do this is to utilize some custom
XSLT in a BizTalk map as below :

<xsl:key name="Code-types" match="trk_unit_roll_ageable" use="@sch_prod_order_id"/>
<xsl:template match="/">
 <ns0:Rolls xmlns:ns0="http://ObjectSharp">
  <xsl:for-each select="//trk_unit_roll_ageable[count(. | key('Code-types', @sch_prod_order_id)[1]) = 1]">
   <xsl:element name="ProdOrderIds">
    <xsl:attribute name="sch_prod_order_id">
     <xsl:value-of select="@sch_prod_order_id"/>
    </xsl:attribute>
   </xsl:element>
  </xsl:for-each>
 </ns0:Rolls>
</xsl:template>

This XSLT can be used in a map. A method to include custom XSLT in a map is discussed in more detail HERE

The final constructed message (msgDistinctOrdersIds) contains a list of distinct order ids:

  <ProdOrderIds sch_prod_order_id="10049"/>
  <ProdOrderIds sch_prod_order_id="10043"/>
  <ProdOrderIds sch_prod_order_id="10048"/>
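What the Muenchian-grouping XSLT above computes can be sketched in plain Python: the distinct sch_prod_order_id values, in document order of first appearance.

```python
import xml.etree.ElementTree as ET

normalized = """<Rolls>
  <trk_unit_roll_ageable trk_unit_id="10003" sch_prod_order_id="10049"/>
  <trk_unit_roll_ageable trk_unit_id="10004" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10005" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10006" sch_prod_order_id="10049"/>
  <trk_unit_roll_ageable trk_unit_id="10007" sch_prod_order_id="10048"/>
  <trk_unit_roll_ageable trk_unit_id="10008" sch_prod_order_id="10043"/>
</Rolls>"""

root = ET.fromstring(normalized)
seen, distinct = set(), []
for roll in root.findall("trk_unit_roll_ageable"):
    order_id = roll.get("sch_prod_order_id")
    if order_id not in seen:   # keep only the first occurrence, like key(...)[1]
        seen.add(order_id)
        distinct.append(order_id)
```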

b) Use an XPath statement to find the count of distinct Order Ids.
In this case an XPath statement is used in the orchestration as below:

varIntTotalNumberOfDistinctOrders = xpath(msgDistinctOrdersIds,"number(count(/*[local-name()='Rolls' and namespace-uri()='http://ObjectSharp']/*[local-name()='ProdOrderIds' and namespace-uri()='']))");

In the above example the XPath statement returns a value of three.

c) To create each separate message, a Loop shape is used in the orchestration. The number of times to iterate the loop is the value returned from the
XPath statement of step b).

d) For each iteration in the loop shape:
i) Use an xpath statement to find the current distinct Order Id in the iteration as below:
   varIntSchProdOrderId = xpath(msgDistinctOrdersIds,"number(//ProdOrderIds[" + varStrOrderCounter + "]//@sch_prod_order_id)");
   The message used in the xpath statement is the message created in step a). This is a message with the distinct order ids.
ii) Construct a helper XML message (msgSplitterParameters) that will be used in the map to aid in the splitting.
   In this case this message contains the unique Order Id of the current iteration. An expression
   shape with the following code is used to create a brand-new version of the message in each iteration of the loop:
varXMLDomForSplittingParametersMsg.LoadXml(@"<ns0:SplitterParameters ProdOrderID=""0"" OrderCount=""0"" UniqueIdentifier=""""  xmlns:ns0=""http://ObjectSharp"" />");

construct msgSplitterParameters
   msgSplitterParameters = varXMLDomForSplittingParametersMsg;
   msgSplitterParameters.ProdOrderID = varIntSchProdOrderId;
   msgSplitterParameters.OrderCount = varIntTotalNumberOfDistinctOrders;
   msgSplitterParameters.UniqueIdentifier = varStrUniqueIdentifier;

Note: A discussion of how to construct messages in an orchestration is HERE

e) For each iteration of the loop, Use a map in the orchestration and use the original input message to create a (split) 
message with that unique order id:

A map with two inputs (two XML messages) is used to create the final output message. The first source used in the map is the original input message with all order items.
The second source is the parameter message constructed in part d). The whole trick of this implementation is to change this parameter message in each iteration of the loop.
Each iteration will then produce the correct split message.

The map is quite simple. The first input is the original input message, used to create each split message with one unique order id. The second message (the parameter message) is used to filter out the line items with that order id from the first input of the map. A logical Equal functoid in the map is used to do the actual filtering.

For example for the first iteration of the loop in the above example, one of the values in the
parameter message is set to:

msgSplitterParameters.ProdOrderID = "10049"

Then when the map is invoked in the loop, the following split message will be produced :

<Rolls UniqueOrderID="10049">
    <trk_unit_roll_ageable trk_unit_id="10003" pro_product_id="10031" sch_prod_order_id="10049"/>
    <trk_unit_roll_ageable trk_unit_id="10006" pro_product_id="10022" sch_prod_order_id="10049"/>
</Rolls>
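The per-iteration split can be sketched in plain Python (an illustration of the map-plus-Equal-functoid filtering, not BizTalk code): keep only the rolls whose sch_prod_order_id matches the parameter message's ProdOrderID.

```python
import xml.etree.ElementTree as ET

normalized = """<Rolls>
  <trk_unit_roll_ageable trk_unit_id="10003" sch_prod_order_id="10049"/>
  <trk_unit_roll_ageable trk_unit_id="10004" sch_prod_order_id="10043"/>
  <trk_unit_roll_ageable trk_unit_id="10006" sch_prod_order_id="10049"/>
</Rolls>"""

def split_for_order(source_xml, prod_order_id):
    """Build one <Rolls> split message holding only the matching rolls."""
    source = ET.fromstring(source_xml)
    out = ET.Element("Rolls", UniqueOrderID=prod_order_id)
    for roll in source.findall("trk_unit_roll_ageable"):
        # the Equal-functoid filter: keep rolls for this order id only
        if roll.get("sch_prod_order_id") == prod_order_id:
            out.append(roll)
    return out

split = split_for_order(normalized, "10049")
```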

Again the full sample can be downloaded HERE
Read the ReadMe.txt file in the zip file, before unzipping and installing.

Note that the solution also contains a Normalizer Orchestration and an
Aggregator Orchestration that are discussed in other blog entries.

Note: Many methods exist to split messages in BizTalk,
such as using Envelopes, custom XSLT, XPath, etc.
Please check out some of the other methods HERE, HERE, HERE, HERE

BizTalk 2004 -> Testing Orchestrations using Nunit, Submit BizTalk Adapter and a Normalizer Pattern

Creating an Nunit test of a single orchestration, or a series of orchestrations, is not a natural process.
I came up with this method for testing orchestrations using Nunit when I was working on a BizTalk project; the client required that Nunit tests be performed on the orchestrations that were developed. It is also part of the first demo of a WebCast I did on March 28 2005 called: BizTalk Server 2004 Implementing Enterprise Integration Patterns. It can be viewed HERE as a recorded Webcast.
In the below example a Normalizer pattern orchestration is being tested. The full BizTalk solution can be downloaded at the end of this blog.
The details below are somewhat involved, but if you download the solution, then install, analyze, and run the BizTalk demos, it will become clear.

Getting back to the project: I was applying patterns to this particular integration solution. This in turn produced a more modular type of solution, with a Normalizer (this is a pattern) orchestration and a Message Aggregator (another pattern) orchestration. Because the solution was more modular in nature, it was also easier to test. Additionally, this method uses a test harness orchestration to asynchronously call the Normalizer Orchestration. In the production environment the test harness orchestration would be replaced by another orchestration.

This testing method will not work for every type of Orchestration scenario, but works best if the Orchestration is modular in nature (i.e. implements some pattern). Even if you do not use this method for testing Orchestrations, this example shows:
1) A Normalizer pattern (two different BizTalk implementations are discussed).
2) How to pass multi-part messages between orchestrations.
3) How to conditionally turn an asynchronous call to another orchestration into a Synchronous Call.

The basic flow to test the Normalizer Orchestration is to:
1) Use .NET code in a Nunit Test Harness to produce an XML message that will ultimately be passed to the Normalizer Orchestration.
2) The .NET Code then calls the BizTalkMessaging.SubmitSyncMessage method to submit the XML message
to a Request-Response Receive Physical Port that uses the Submit Adapter. Because the SubmitSyncMessage method is used, the code will then wait for a response back from the Receive Port.
3) The Receive Port then publishes the incoming message from the Nunit test harness into BizTalk.
4) A Test harness Orchestration then subscribes to the message published by the Submit Adapter. This Orchestration will then use the Start Shape and asynchronously call the Normalizer Orchestration passing a couple of parameters, one containing the XML message for the Normalizer to work on and the other
being a Self Correlated Port.
5) The Normalizer Orchestration will then process the message and then construct a multipart message to be sent back to the Test Harness Orchestration. The Normalizer Orchestration will then call back to the Test Harness Orchestration using the Self Correlated Port passed in as a parameter.
6) The Test Harness Orchestration then constructs a response message using the results from the Normalizer Orchestration. This message is then passed back to the Nunit test harness code, using the logical Request-Response port in the Test Harness Orchestration.
7) The Request-Response Receive Physical Port described in 2) will then send the response message back to the .NET Nunit Test Harness code via the BizTalkMessaging.SubmitSyncMessage call.
8) The .NET code in the Nunit Test Harness can then interrogate the message sent back, and perform some tests on this message.
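The shape of this test can be sketched in plain Python (an illustration only; the normalizer function below is a made-up stand-in for the orchestration under test): an asynchronous call is turned into a synchronous one by blocking on a callback channel, roughly what SubmitSyncMessage plus the self-correlating port achieve.

```python
import queue
import threading

def normalizer(message, callback_port):
    normalized = message.upper()       # stands in for the normalizer map
    callback_port.put(normalized)      # call back on the port passed in

def submit_sync(message):
    callback_port = queue.Queue()      # the self-correlating port
    worker = threading.Thread(target=normalizer, args=(message, callback_port))
    worker.start()                     # the asynchronous Start shape
    response = callback_port.get(timeout=5)  # block for the response, like SubmitSyncMessage
    worker.join()
    return response

# the Nunit-style test then asserts on the response message
result = submit_sync("rolls from mill one")
```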

Therefore the important pieces to this are:
1) .Net code in the Nunit Test Harness
2) The Submit Direct Adapter
3) A Test Harness Orchestration. In a production environment, this orchestration would be replaced by another orchestration that will call the Actual Orchestration to be tested as in 4).
4) The Actual Orchestration being tested, in this case the Normalizer Orchestration.

To go into more detail:

1) In a Nunit test harness, submit a message
to BizTalk. (Note: this is not the full code)

string submitURI = "submit://MessageNormalizer";
BizTalkMessaging btm = new BizTalkMessaging();
System.Xml.XmlDocument xmlDomRollsFromMillOne = new System.Xml.XmlDocument();
// (load the test XML message into xmlDomRollsFromMillOne here)
IBaseMessage responseMsg = null;
responseMsg = btm.SubmitSyncMessage(btm.CreateMessageFromString(submitURI, xmlDomRollsFromMillOne.OuterXml));
// Note: Because the SubmitSyncMessage method is used, this code will now wait for the response message coming back.

Nunit 2.2.0, can be downloaded HERE

SubmitDirect BizTalk adapter can be installed by running ->
C:\Program Files\Microsoft BizTalk Server 2004\SDK\Samples\Adapters\SubmitDirect\Setup.bat
Also read about the SubmitDirect Adapter sample HERE

For users of BizTalk 2000 and BizTalk 2002, the SubmitDirect adapter replaces the BizTalk 2000 and 2002 API calls Submit and SubmitSync. The SubmitDirect adapter allows messages to be submitted to BizTalk Server programmatically. Therefore you can write some C# or VB.NET code that submits a message to BizTalk Server and optionally receives back a response.

2) Create a Request-Response Receive Physical Port that uses the Submit Adapter as below:

3) Create some common orchestration types to help in the testing:

a) Multi-part Message Type: mpMsgTypeRolls
(This multipart is used to pass information to the Normalizer Orchestration)
mpMsgTypeRolls contains Message Parts:
i) msgCallBackToOrchestrationFlag. This flag is set to "1" if the Normalizer Orchestration is to call back
to the calling orchestration. Therefore in the development/testing environment this flag is set to "1".
In the production environment this flag is set to "0", so the Normalizer Orchestration can be used
in production without modifying anything, just changing the passed flag to "0".
ii) msgXMLRolls. This is a string of XML that contains the Roll Production Orders.

b) Multi-part Message Type: mpMsgTypeTestHarnessResults
(This multipart is used to pass information from the Normalizer Orchestration back to the test harness orchestration.)
mpMsgTypeTestHarnessResults contains Message Parts:
i)   msgErrorMessage. This is a string that will be set with any error message.
ii)  msgNumberOfRollsSplit. This is populated by the Message Splitter Orchestration.
iii) msgXmlRolls. This is returned with the mapped normalized message.

c) Multi-part Message Type: mpMsgTypeInvalidMessage
(This multipart is used to pass information from the Normalizer Orchestration to the Invalid Message Orchestration.)
mpMsgTypeInvalidMessage contains Message Parts:
i) msgInvalidXMLMessage. This contains the invalid XML that the Normalizer Orchestration cannot parse.

d) Port Type: PortTypeTestHarnessResults
This port type is used to create a port that will be passed as a parameter into the Normalizer Orchestration.
The Normalizer Orchestration will then call back on this port to the calling orchestration.

4) The test harness orchestration then receives the message from the Nunit test harness via the Submit Receive Port, and then in turn calls the Normalizer Orchestration with the following parameters:
i) mpMsgRollsRequest. This multipart message is populated with the XML message to normalize and a flag to
indicate to call back to this particular orchestration.
ii) PortReceiveBackTestHarnessResults. Port of type PortTypeTestHarnessResults. This is a Direct - Self Correlating Port.

The Normalizer Orchestration can then call back to the test harness orchestration even though the
test harness orchestration called the Normalizer Orchestration asynchronously. A good discussion of Direct
Port Binding Types can be found HERE 

5) The Normalizer Orchestration then receives the information from the passed parameters and
normalizes the message. Because an untyped message is passed to this orchestration, it must first determine the map to invoke. Therefore a property on the message is interrogated to determine the type of message:

varStrMessageType = msgIncomingRolls(BTS.MessageType);

varStrMessageType is then populated with the correct message type, such as http://RollsFromMillOne#Rolls.
This is of course the TargetNamespace#RootNodeName combination of the incoming message. A Decide shape in this orchestration then determines which map to use to normalize the message, based on the contents of the message type. If the message is unrecognized, then the Invalid Message Channel Orchestration is called (this is another pattern).
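Deriving the message type and dispatching on it can be sketched in plain Python (an illustration of the logic; BTS.MessageType itself is computed by BizTalk):

```python
import xml.etree.ElementTree as ET

# the decision table from the Decide shape: message type -> normalizer map
NORMALIZER_MAPS = {
    "http://RollsFromMillOne#Rolls":   "Map_RollsFromMillOne_To_RollsInternal.btm",
    "http://RollsFromMillTwo#Rolls":   "Map_RollsFromMillTwo_To_RollsInternal.btm",
    "http://RollsFromMillThree#Rolls": "Map_RollsFromMillThree_To_RollsInternal.btm",
}

def message_type(xml_text):
    """TargetNamespace#RootNodeName, the way BTS.MessageType reports it."""
    root = ET.fromstring(xml_text)
    # ElementTree reports a namespaced tag as {namespace}localname
    if root.tag.startswith("{"):
        ns, local = root.tag[1:].split("}", 1)
        return ns + "#" + local
    return "#" + root.tag

def pick_map(xml_text):
    # None means: hand the message to the invalid-message channel
    return NORMALIZER_MAPS.get(message_type(xml_text))

chosen = pick_map('<Rolls xmlns="http://RollsFromMillOne"/>')
```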

6) Once the Normalized message has been processed, the Test Harness Orchestration is called back from the Normalizer Orchestration using the port PortCallBackResults that was passed as a parameter by the calling test harness orchestration. This is how an asynchronous call can be conditionally turned into a synchronous call.

7) The Test Harness Orchestration then calls back to the Submit Direct test harness code in 1). This code can then perform tests on the results sent back.

For a comprehensive Nunit Test Framework please go HERE

Another Normalizer Pattern

A much simpler method to create a Normalizer pattern in BizTalk is by specifying a Set of Maps in
a Physical Receive Port or Physical Send Port as below:

Depending on message type of the incoming message, one of the following maps will be invoked
producing the normalized message:

MessageType                        Map Invoked
http://RollsFromMillOne#Rolls      Map_RollsFromMillOne_To_RollsInternal.btm
http://RollsFromMillTwo#Rolls      Map_RollsFromMillTwo_To_RollsInternal.btm
http://RollsFromMillThree#Rolls    Map_RollsFromMillThree_To_RollsInternal.btm

In the above scenario, no orchestrations are required and all the necessary mapping can be
accomplished in physical Receive or Send Ports.

Again the full sample can be downloaded HERE. To install and run, Read the ReadMe.txt file in the zip file, before unzipping and installing. Also included with the sample are a Splitter Pattern Orchestration and an Aggregator Pattern Orchestration that will be discussed in future blog entries.


BizTalk Server 2004 and Sql Server 2000 -> XML Auto and XML Explicit

BizTalk 2004 provides support for processing XML documents returned from Sql Server
via the Sql Adapter that ships with BizTalk Server 2004.

For example:

Select *
From Customers
For XML Auto

will produce an XML document that can be received and processed by BizTalk 2004 via the Sql Receive Adapter.

The BizTalk documentation states that only the XML Auto clause is supported and
not the XML Explicit clause. But in fact BizTalk can process Sql Server XML documents that
are generated using the XML Explicit clause (this will be explained below).

If you have not used the Sql Adapter before, try the following two examples in the BizTalk SDK:

Using the SQL Adapter with a Stored Procedure in an Orchestration

SQL Adapter (BizTalk Server Sample)

For BizTalk to interact with an XML document originating from Sql Server,
one of the first steps is to create a xsd schema that describes the
XML returned from a Select statement.
This xsd schema can be automatically generated in a BizTalk project by adding a Generated Item, as outlined
HERE. This only works if the Select statement has an XML Auto clause (and an XMLData clause, which is removed after the
schema is generated). If a Select statement with an XML Explicit clause is used to generate the
schema, the xsd schema generation will fail with an error such as: The required attribute 'name' is missing.
An error occurred at , (0,0). Or: Failed to execute sql Statement. Please ensure that the supplied syntax is correct.

This is possibly why the BizTalk documentation states that the XML Explicit is not supported.
But again, BizTalk can use XML returned by XML Explicit (explained below).

So why use the XML Explicit clause over the XML Auto clause? Simply because the XML Explicit clause
gives you much more control of the XML structure that is returned by Sql Server compared to
the XML Auto clause.

For example, an Order is associated with a Shipper, Customer and Order Items. 
The following Select Statement using XML Auto:

Select [Order].OrderID,
 Shipper.ShipperID as ShipperID,  
 Shipper.CompanyName As ShipperName,
 Customer.CompanyName as CustomerName,
 OrderDetails.Quantity as Quantity,
 OrderDetails.UnitPrice as UnitPrice,
 OrderDetails.ProductID as ProductId  
From Orders as [Order]
 Join Shippers as Shipper
 On [Order].ShipVia = Shipper.ShipperID
        Join Customers as Customer
   On [Order].CustomerID = Customer.CustomerID 
        Join [Order Details] as OrderDetails
 On [Order].OrderID = OrderDetails.OrderID 
For Xml Auto

Will return XML in the format of:

 <Order OrderID="11077">
  <Shipper ShipperID="2" ShipperName="United Package">
   <Customer CustomerName="Rattlesnake Canyon Grocery">
      <OrderDetails Quantity="24" UnitPrice="19.0000" ProductId="2" />
      <OrderDetails Quantity="4" UnitPrice="10.0000" ProductId="3" />
   </Customer>
  </Shipper>
 </Order>
 <Order OrderID="11076">
  <Shipper ShipperID="2" ShipperName="United Package">
   <Customer CustomerName="Bon app'">
      <OrderDetails Quantity="20" UnitPrice="25.0000" ProductId="6" />
      <OrderDetails Quantity="20" UnitPrice="23.2500" ProductId="14" />
      <OrderDetails Quantity="10" UnitPrice="9.2000" ProductId="19" />
   </Customer>
  </Shipper>
 </Order>

Note: The top-level parent node, Orders, is added by the Sql Adapter.

The above format is not desired because the parent node
of the Customer node is the Shipper node and the parent node of the
OrderDetails is the Customer node. In this case the desired format of the
XML message will have the Order Node being the parent node of the
Shipper, Customer and OrderDetails nodes as below:

  <Order OrderID="11077">
     <Shipper ShipperID="2" ShipperName="United Package"></Shipper>
     <Customer CustomerName="Rattlesnake Canyon Grocery"></Customer>
     <OrderDetail Quantity="24" UnitPrice="19.0000" ProductId="2" />
     <OrderDetail Quantity="4" UnitPrice="10.0000" ProductId="3" />
  </Order>
  <Order OrderID="11076">
    <Shipper ShipperID="2" ShipperName="United Package"></Shipper>
    <Customer CustomerName="Bon app'"></Customer>
    <OrderDetail Quantity="20" UnitPrice="25.0000" ProductId="6" />
    <OrderDetail Quantity="20" UnitPrice="23.2500" ProductId="14" />
    <OrderDetail Quantity="10" UnitPrice="9.2000" ProductId="19" />
  </Order>

To return the XML in this (above) format, the below Select statement with a XML Explicit clause is used:

-- Note: each Union branch must return the same column list as the first Select
SELECT  1     as Tag,
        NULL  as Parent,
 Orders.OrderID as [Order!1!OrderId],
 Null as [Shipper!2!ShipperID],
 Null as [Shipper!2!ShipperName],
 Null as [Customer!3!CustomerName],
 Null as [OrderDetails!4!OrderId!hide],
 Null as [OrderDetail!5!Quantity],
 Null as [OrderDetail!5!UnitPrice],
 Null as [OrderDetail!5!ProductId]
From Orders
 Join #OrdersToGet
 On #OrdersToGet.OrderID = Orders.OrderID
Union All
Select  2,
 1,
 #OrdersToGet.OrderID,
 Shippers.ShipperID,
 Shippers.CompanyName,
 Null, Null, Null, Null, Null
From Shippers
 Join #OrdersToGet
 On #OrdersToGet.ShipVia = Shippers.ShipperID
Union All
Select  3,
 1,
 #OrdersToGet.OrderID,
 Null, Null,
 Customers.CompanyName,
 Null, Null, Null, Null
From Customers
 Join #OrdersToGet
 On #OrdersToGet.CustomerID = Customers.CustomerID
Union All
Select  distinct 4,
 1,
 #OrdersToGet.OrderID,
 Null, Null, Null,
 OrderDetails.OrderID,
 Null, Null, Null
From [Order Details] as OrderDetails
 Join #OrdersToGet
 On #OrdersToGet.OrderID = OrderDetails.OrderID
Union All
Select 5,
 4,
 #OrdersToGet.OrderID,
 Null, Null, Null,
 OrderDetails.OrderID,
 OrderDetails.Quantity,
 OrderDetails.UnitPrice,
 OrderDetails.ProductID
From [Order Details] as OrderDetails
 Join #OrdersToGet
 On #OrdersToGet.OrderID = OrderDetails.OrderID
Order By [Order!1!OrderId],Tag,[OrderDetails!4!OrderId!hide]
For XML Explicit

Note: The above Sql Statement is part of a stored procedure that
can be downloaded at the end of this blog.

The XML Explicit clause is used with unions of Select statements
and the keywords Parent and Tag to generate the desired result.
The Order By clause is also very important when using XML Explicit.
It's a good idea to take off the For XML Explicit clause to view the regular sql result set (especially if the
format of your returned XML is incorrect, or to help debug).
A good description of the XML Explicit clause with some examples is as below:


Below describes how to create a BizTalk xsd Schema for XML returned using a XML Explicit Sql Statement:

1) Create and test (perhaps using Query Analyzer) a XML Explicit Sql Select statement
that includes the Tag, Parent and Unions of Select Statements.

2) Create a Sql Receive Port that will invoke the Select statement (this could be a stored proc)
that contains the XML Explicit clause. This Sql Receive Port should use the PassThrough pipeline.
Note: in the Receive Location properties you must include the namespace of the
generated XML, the root node name, and the Select statement or stored procedure. In this example it was:
Document Target NameSpace : http://ObjectSharp/Orders
Document Root Element Name: Orders
Sql Command: exec BTSGetOrders 1000

3) Create a File Send Port that subscribes to the Receive Port, for example create
a filter on the Send Port with BTS.ReceivePortName = "Name of Receive Port Here"
This Send Port should also use the PassThrough Pipeline.

4) Turn on the above two ports and let a sample message be placed into the File Location
of the Send Port.

5) Create a new BizTalk Project (or use an existing one) and add a Generated Item. Use the Generate
Schemas option and select -> Well-Formed XML. Select the xml file produced in step 4).
An xsd schema is now generated that can be used in a BizTalk Solution. If this BizTalk
project is deployed, the Receive Port and Send Port created in steps 2) and 3) can now use the
standard XML Pipelines.
Note : It may be necessary to modify the generated xsd schema, for example to alter the
attribute types or the min/max occurs properties.
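As a sketch of that kind of edit (the attribute name here is illustrative), the schema generator typically emits every attribute as xs:string, which can be tightened by hand:

```xml
<!-- As generated from the sample xml: -->
<xs:attribute name="Quantity" type="xs:string" />

<!-- Hand-edited to match the type the Sql column actually returns: -->
<xs:attribute name="Quantity" type="xs:int" use="required" />
```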

6) The xsd schema produced in step 5) can now be used in a BizTalk Solution for Mapping, Messages etc.
Orchestrations can subscribe to xml messages that are derived from a Select Statement that uses
the XML Explicit clause.

Conclusion: Select Statements using the XML Explicit clause can generate XML documents
that can be consumed by BizTalk Server 2004.

Download the Sample Stored procedure and XSD file HERE

Note : The sample uses tables found in the NorthWind Database that is optionally installed with
Sql Server.

BizTalk Server WebCast -> BizTalk Server 2004 Implementing Enterprise Integration Patterns

I am doing a WebCast on Mar 28, 2005:

BizTalk Server 2004 Implementing Enterprise Integration Patterns  

Description:   This intermediate-level presentation will discuss how to implement various Enterprise Integration Patterns in BizTalk Server 2004. Discussions and demonstrations will include patterns such as: Message Translator, Message Broker, Message Splitter, and Message Aggregator.

To sign up :

(I have updated the link to the recorded webcast. Matt May 10 2005).


Bloggers Guide To BizTalk, Dec 2004 Edition

If you are working on a BizTalk 2004 project, please follow the link below to download this extremely useful source of information, kindly provided by Alan Smith:

BizTalk 2004 Map, Getting Distinct Values using an Inline XSLT Call Template

Need to get a distinct list of items from an XML message.

For example the XML message containing Order Items is as below:

(Note that the Order ids are not sorted)

From the above message produce a distinct list of order ids such as below:

The custom XSLT code to get the distinct list of values looks like this:
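The original XSLT ships with the downloadable solution; a sketch of an equivalent call template, assuming repeating Order elements with an OrderId attribute (element and template names here are illustrative):

```xml
<xsl:template name="GetDistinctOrderIds">
  <!-- Keep each Order whose OrderId has not already appeared
       on an earlier Order anywhere in the document -->
  <xsl:for-each select="//Order[not(@OrderId = preceding::Order/@OrderId)]">
    <xsl:element name="Order">
      <xsl:attribute name="OrderId">
        <xsl:value-of select="@OrderId" />
      </xsl:attribute>
    </xsl:element>
  </xsl:for-each>
</xsl:template>
```

With the Inline XSLT Call Template script type, only the named template is supplied; BizTalk generates the surrounding stylesheet and invokes the template during the transform.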

To use an Inline XSLT Call Template inside of a BizTalk 2004 Map,
a) Create a new Map inside of your BizTalk project
b) Choose the source and destination schemas
c) Drop a Scripting Functoid on the Map, then in the properties window with
the Scripting Functoid selected,  press the -> Configure Functoid Script button.
d) For the Script type choose -> Inline XSLT Call Template. Place the XSLT into the Inline Script Buffer

The map to get the distinct Order Ids looks like the below:

Note : The scripting functoid inside of the map has no incoming or outgoing links.
The Custom XSLT will handle the complete transformation. Ignore any warnings about no incoming
or outgoing links when building the project.

Download the Complete solution HERE