Monday, December 31, 2007

Integrating MOSS, WF and Infopath

I ventured into the relatively "uncharted" territory of the SharePoint world. I have been working on setting up a framework involving MOSS, Windows Workflow Foundation and InfoPath 2007. The business requirements are pretty standard: the client wants to set up a business process for "New Hire". When a new hire joins, certain business processes need to be fired, like creating domain accounts, allocating hardware and setting up the user in different databases. Some of the processes will be executed in series and others in parallel.

The goal is to set up a scalable architecture that can potentially hook up with BizTalk in the near future. Essentially we are setting up a "Business Process Management (BPM)" system. Recently my company established a partnership with a company (PNM Soft) that is one of the leaders in BPM software. I evaluated their software so that I would be in a position to recommend it to my clients. It is very powerful and easy to use: it does not require much coding, and even a pure business user can design complex BPM processes with it.

My ongoing work in developing a scalable framework for my client using MOSS and other tools will also help me determine the value added by the BPM software, compared with designing a BPM process from scratch using standard MSFT technologies.

Solution:


1) Two InfoPath forms (the first is a browser-enabled form published to SharePoint; the second is the InfoPath task form published to a network location). I tried to get a single form to do both, but I ran into problems: SharePoint won't let me publish the same form as a workflow attachment and also as a SharePoint browser-enabled form. The second form is what you would call the "Task Item" form. It is one of the 4 forms that a workflow uses.

2) A workflow that is attached to a form library. Users publish the first form to the form library, which triggers the workflow. This workflow is a SharePoint sequential workflow. A good working demo can be found at:

http://weblog.vb-tech.com/nick/archive/2007/02/25/2207.aspx

Following are some helpful tips:


1) Cloning an InfoPath form to create a second form is not a good practice. An InfoPath form has a unique internal ID that is copied over to the new form and can create issues when both forms are used in a WF setup. I tried using one form as the initiation form and the clone as the task item form but could not get it working; I ran into issues passing parameters from the workflow to the task item form, and I am not aware of any workaround. I think it is a good practice to create the forms from scratch, and I would not recommend the "schema first" approach for the same reasons.

2) Make sure that the network-published InfoPath forms (*.xsn) listed under the Metadata section of "workflow.xml" (part of the WF solution) exist and are in the correct folder location. Delete the references to forms not being used from "workflow.xml". Otherwise you may get an error like "Form is closed".

3) If you are using VSTA to code InfoPath forms, make sure that the DLL file is also part of the folder where the *.xsn forms are located. Otherwise you may get an error "Type: InfoPathLocalizedException, Exception Message: The specified form cannot be found". Another thing to keep in mind is the use of "FormState" to define global variables. For browser-enabled forms there is a need to preserve state across browser sessions, and "FormState" helps you do that. The other option is to have the workflow pass the values into the "resource" file that is part of the InfoPath form. This resource file serves as a secondary data source: it accepts values from the workflow and then updates the InfoPath task form. I like this option better because using the workflow to maintain the state ensures that the InfoPath form is updated only after the workflow has executed, making sure that the state the InfoPath form represents is actually the state that the workflow indicates. There is no need to use the "FormState" property in the latter case.

Example: Suppose the user clicks a check box on the InfoPath task form indicating that a particular step is complete and the workflow should flow to the next step. If you preserve this state (the check box being checked) using "FormState", then even if the workflow errors out, the check box will still show as checked and the user will get the impression that the step completed successfully. But having the workflow update this check box through "Extended Properties" ensures that the check box reflects the right state only after the workflow has done what it was supposed to do.
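As a rough sketch of that approach (the class, handler and "StepComplete" field below are hypothetical, not from the actual solution), the workflow can push its state into the task through the task properties' ExtendedProperties collection, and the task form can pick those values up through its ItemMetadata secondary data source:

using System;
using Microsoft.SharePoint.Workflow;

// Fragment of a SharePoint sequential workflow code-behind (names are illustrative).
public partial class NewHireWorkflow
{
    public Guid createTask_TaskId = default(Guid);
    public SPWorkflowTaskProperties createTask_TaskProperties = new SPWorkflowTaskProperties();

    private void createTask_MethodInvoking(object sender, EventArgs e)
    {
        createTask_TaskId = Guid.NewGuid();
        createTask_TaskProperties.Title = "New Hire - Create Domain Account";

        // Values set here surface in the InfoPath task form through the
        // ItemMetadata.xml secondary data source, so the form only reflects
        // "complete" once the workflow itself has updated the value.
        createTask_TaskProperties.ExtendedProperties["StepComplete"] = "false";
    }
}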

I noticed that the order of populating the secondary data source from the workflow does matter. If the order is not correct, the values will not propagate from the workflow to the secondary data source to the InfoPath form.

Check out the image below:

Note: The order of fields in the secondary data source schema matches the order in which the workflow assigns values.


Use "Trusted Security" while publising forms with embedded VSTA code.

PS: VSTA and VSTO carry the same weight when used for browser-only forms.

4) The Windows Workflow DLL needs to be GACed, but the DLL for the VSTA code does not need to be GACed. It does, however, need to be copied to the same network folder where the forms are published.

PS: Suppose you create an InfoPath form with embedded VSTA code and then remove the code before publishing it. You still need to copy the DLL file to the correct folder containing the .xsn form, or you will get a "Form has been closed" error. I found this out the hard way!

5) Every time an InfoPath form is changed, it should be republished to the network and the updated DLL should be copied; then uninstall/install the feature. Workflows that are already in progress will be updated with the new changes, so keep this in mind before applying changes and, if needed, wait for "In Progress" workflows to finish.

6) There is no need to re-GAC the workflow DLL if only the InfoPath forms are changed. It is, however, important to restart IIS after GACing any new version; the restart ensures that the new changes take effect.
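For reference, redeploying the workflow assembly typically comes down to something like the following two commands (the assembly name here is only a placeholder):

gacutil /i NewHireWorkflow.dll
iisreset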

7) Do not try to change the unique ID of a form by going to File ----> Properties in design view. It seems that this unique ID is referenced in more than one place in the form. This is the same ID that is referenced in "workflow.xml" and indicates to the workflow which form should be opened as the Task form, Initiation form, Association form and Modification form.

8) "check box" control in Infopath form can be changed from "Boolean" data type to "Text" data type and can be used to accept updates from workflow and this also gives more flexibility to
program it using VSTA and VSTO.




To do this:

Go to Design View -----> Data Source -----> double-click the "check box" control -----> a properties window will pop up -----> change the Data Type from "Boolean" to "String" -----> this enables the "Fx" icon; click it. Now you can select the value that you want to pass from the secondary data source. This secondary data source will be configured to receive the XML from the workflow.







Also, double-click the check box in form view and set the check box properties as shown below.
Now when you pass values of "true" or "false" as the string data type from the workflow, the check box will accept them.



9) Using an initiation form gives easy access to the XML data stream that is flowing into the workflow; the XML stream can be accessed through workflowProperties.InitiationData. I did not want to use an initiation form because I wanted the workflow to start automatically when something new is published, or something existing changes, in the form library. To do so, you need the following code in the "onWorkflowActivated" activity:


SPWeb thisWeb = new SPSite(workflowProperties.SiteId).OpenWeb(workflowProperties.WebId);

SPListItem thisItem = thisWeb.Lists[workflowProperties.ListId].GetItemById(workflowProperties.ItemId);


byte[] fileBytes = thisItem.File.OpenBinary();
UTF8Encoding encoding = new UTF8Encoding();
String xmlString;
xmlString = encoding.GetString(fileBytes);
xmlString = xmlString.Trim();

// Deserialization. InitForm is a class generated from the schema of the incoming message

XmlSerializer serializer = new XmlSerializer(typeof(InitForm));
XmlTextReader reader = new XmlTextReader(new System.IO.StringReader(xmlString));
InitForm initform = (InitForm)serializer.Deserialize(reader);


In the end, I concluded that the BPM software is very efficient when it comes to designing pure BPM solutions. The MSFT tools are definitely powerful for complex B2B scenarios and integration with other systems, but they are overkill for simple BPM scenarios.

Wednesday, August 29, 2007

Implementing WCF

Why WCF?

WCF, aka Windows Communication Foundation, is the latest product from MSFT for creating distributed applications. The year 2007 has been buzzing with new product releases on a weekly basis, and it is sometimes hard to keep track of them all. I can speak for myself that I end up missing some new releases now and then. It is tough to keep one eye on client work and another eye on MSFT press releases, but now it has become part of the job. Working in the consulting sector is a 24/7 job: when you are not working with a client, you are scanning different websites looking for bits and bytes on the latest product releases and technology trends. I do love my job, but sometimes it gets really tough to keep up with the pace of technology changes. Sometimes you have to be like Tiger Woods and choose to leave. If only everyone was as good as him!

Distributed technologies have become the center of a turf war for technology companies, and web services are an important piece in this war. The turf is dotted with competing platforms like Java and open source products like Tungsten.

WCF is a further enhancement of the ASMX web services released with .NET Framework 2.0. WCF is part of the .NET Framework 3.0; version 3.5 is out too, in beta form. WCF lets users get the benefits of ASMX, WSE and messaging services in one product. WCF supports the MSMQ, HTTP and TCP protocols, whereas ASMX web services only support HTTP; additional transports can also be configured for WCF. WCF also offers a configuration editor with a GUI for monitoring performance and tracking messages and errors. This tool helps reduce operation cost with out-of-box monitoring and tracing of data. Details at the end.

It is a highly flexible and configurable solution, but it does have some drawbacks compared to .NET Remoting, especially when it comes to running multiple service host instances under a single host (.NET Remoting actually lets you add service instances). WCF can be run as a self-hosted service, in IIS, or in WAS. While running WCF as a self-hosted service, check the Task Manager window: the WCF service will show up as Services.exe, and there will be only one instance of this service at any given time. Multiple instances are not supported.






Demo:

Used an MSFT base sample, organized it and added personal notes and updates.
1) Create a blank solution called WCF
2) Add a project and call it “Host”
3) Add a C# library file and call it “Host.cs”
· Add a reference to System.ServiceModel.dll
4) Add another project and call it “Client”


I ) Create WCF Host


5) Define Service Contract
// Define a service contract.
[ServiceContract(Namespace = "http://Microsoft.Demo.WCF")]
public interface IMath
{
[OperationContract]
double Add(double n1, double n2);

}
The “ServiceContract” attribute marks the interface as carrying metadata for WCF.
The “OperationContract” attribute is roughly equivalent to the “public” keyword: it is what makes the operation accessible to clients.
PS: Since the data type used is simple, we have not explicitly defined a “DataContract”. For complex data types it is important to define the DataContract explicitly. It is also a good practice to keep the “ServiceContract” and “DataContract” as separate implementations.
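For illustration only (this type is not part of the demo), an explicit DataContract for a complex parameter could look like this:

using System.Runtime.Serialization;
using System.ServiceModel;

// Hypothetical complex type carrying the operands, marked with an explicit DataContract.
[DataContract]
public class CalculationRequest
{
    [DataMember]
    public double Operand1;

    [DataMember]
    public double Operand2;
}

// Kept as a separate contract from the data type, per the note above.
[ServiceContract(Namespace = "http://Microsoft.Demo.WCF")]
public interface IMathEx
{
    [OperationContract]
    double Add(CalculationRequest request);
}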

6) Implement the service

Implementing the service contract is a rather easy process: just use a C# (or VB) class and implement the service interface. This is the part where the logic behind the service is implemented.

//Implement the service
public class MathService : IMath
{
    public double Add(double n1, double n2)
    {
        double result = n1 + n2;
        Console.WriteLine("Return: {0}", result);
        return result;
    }
}

7) Create Host for the service


Until now the service has been in the form of a class library. We will need a host for the service. As mentioned earlier, there are 3 options:

a. Create Self Host
b. Use IIS
c. Use WAS

In this demo we will create our own host because it is easy to Debug. Before we create a host, we need to define the address where the service will reside.

This step involves the following five sub-steps:

d. Create a base address for the service.
e. Create a service host for the service.
f. Add a service endpoint
g. Enable metadata exchange.
h. Open the service host

// Step d
Uri baseAddress = new Uri("http://localhost:8000/DemoWCF/Service");

// Step e
ServiceHost serviceHost = new ServiceHost(typeof(MathService), baseAddress);
try
{
// Step f
serviceHost.AddServiceEndpoint(
typeof(IMath),
new WSHttpBinding(),
"MathService");

// Step g
ServiceMetadataBehavior smb = new ServiceMetadataBehavior();
smb.HttpGetEnabled = true;
serviceHost.Description.Behaviors.Add(smb);

This step is required so that the svcutil command can be run later on to download the 2 client files (config and class). Otherwise the user may get the following error:

“ There was no endpoint listening at http://localhost:8000/DemoWCF/service that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.
The remote server returned an error: (404) Not Found.”

This step is optional if the user decides to create both client files manually.

// Step h
serviceHost.Open();
Console.WriteLine("The service is ready.");
Console.WriteLine("Press to terminate service.");
Console.WriteLine();
Console.ReadLine();
serviceHost.Close();

II) Create WCF client


8) Run the svcutil command to generate the client class and config file:

C:\Program Files\Microsoft Visual Studio 8\Common7\IDE> svcutil /language:cs /out:generatedClient.cs http://localhost:8000/DemoWCF/service

The two generated files will be output.config and generatedClient.cs.

9) Add both of these files to the “Client” project. The complete solution will look something like the image below; note the added references. Rename the files to app.config and MathClient.cs.



10) Add the following code to Client.cs:

using System;
using System.ServiceModel;

namespace Microsoft.Demo.WCF
{
    class Client
    {
        static void Main()
        {
            try
            {
                // Create an EndpointAddress instance for the base address.
                // This endpoint address is the concatenation of the baseAddress variable
                // in the Host class and the address parameter passed to AddServiceEndpoint
                // in the Host class.
                EndpointAddress epAddress = new EndpointAddress("http://localhost:8000/DemoWCF/Service/MathService");
                MathClient client = new MathClient(new WSHttpBinding(), epAddress);

                // Call the contract implementation in the Host class from the Client class.
                double value1 = 29;
                double value2 = 16.00;
                double result = client.Add(value1, value2);
                Console.WriteLine("Add({0},{1}) = {2}", value1, value2, result);

                // Close the WCF client.
                client.Close();

                Console.WriteLine();
                Console.WriteLine("Press <ENTER> to terminate client.");
                Console.ReadLine();
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }
        }
    }
}

This is a simple scenario where we have a single service contract, a single implementation and a single endpoint.

Scenario:

A new client comes in and wants a new operation called “Subtract”.

Two ways of doing this:

1) Add the new service “Subtract” in existing interface IMath
2) Create a new Interface IMath1 for “Subtract”


Add the new service “Subtract” in existing interface IMath

This is the simplest of the updates, but it does not give us separation between the two operations, as both are part of the same interface. The implementation still lives in the same class, with the same service host and endpoint. The only thing that changes is on the client side, where the client calls “client.Subtract” just like “client.Add” in the demo above (see the sketch below).
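A minimal sketch of this option, reusing the names from the demo above:

// The new operation simply joins the existing contract and implementation.
[ServiceContract(Namespace = "http://Microsoft.Demo.WCF")]
public interface IMath
{
    [OperationContract]
    double Add(double n1, double n2);

    [OperationContract]
    double Subtract(double n1, double n2);
}

public class MathService : IMath
{
    public double Add(double n1, double n2) { return n1 + n2; }
    public double Subtract(double n1, double n2) { return n1 - n2; }
}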

Create a new Interface IMath1 for “Subtract”

Create a new interface, say IMath1, that contains “Subtract”, and a second class, say “MathService1”. We can’t use the same class to implement the two interfaces under the same namespace, so we will need to create a new class to implement the new interface (service contract). Another problem that comes with this is the need to have two service hosts: each service host is tied to a unique class (service contract implementation), and since we are using a self-created host we can’t create two service hosts to implement two unique classes.
This is one of the drawbacks of WCF: unlike .NET Remoting, where we can add multiple instances of a service under a single host, we can’t do the same in WCF.
Using IIS as the host can help solve this issue.

Key Points

1) Each WCF service class implements at least one service contract, which defines the operations the service exposes.
2) To distribute different sets of services to multiple clients, a service can expose multiple endpoints, each with a unique service contract.
3) There is a minimum of one endpoint for each service host.
4) Each service host is tied to only one implementation of a contract (class).
5) To expose multiple services, you can open multiple ServiceHost instances within the same host process.
6) Although you can have multiple endpoints for a single service, there is only one service type (see the sketch after this list).
7) You can also dynamically create services with different implementing types by dynamically creating new ServiceHosts.
8) Each endpoint should have a unique relative address.
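As a rough sketch of points 5 and 6 (the net.tcp address and second binding are purely illustrative, not part of the demo), the same service type can be exposed over several endpoints, while another service type would need its own ServiceHost in the same process:

// Two endpoints (different bindings and base addresses) for the same service type.
ServiceHost mathHost = new ServiceHost(typeof(MathService),
    new Uri("http://localhost:8000/DemoWCF/Service"),
    new Uri("net.tcp://localhost:8001/DemoWCF/Service"));

mathHost.AddServiceEndpoint(typeof(IMath), new WSHttpBinding(), "MathService");
mathHost.AddServiceEndpoint(typeof(IMath), new NetTcpBinding(), "MathService");
mathHost.Open();

// A second service type (e.g. MathService1) would need its own ServiceHost,
// which can be opened in the same host process.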




WCF Configuration Editor




This tool can be accessed from the Visual Studio interface: go to Tools ----> WCF Configuration Editor.






Read the help file for details. In summary:


1) Open the config file that was generated after running the svcutil utility. This will populate the different placeholders in the GUI with the relevant data. Click on the Endpoint folder to view the existing configuration. This information can be changed from the GUI without even touching the XML file.





2) Click on the Diagnostics folder and use the toggle switches to browse through the various settings. Check the help file for details.



3) Closing the editor after making changes will save them to the original app.config that was opened in the first step. If we open the config file, we see additional tags added to perform all the functions that we configured in the GUI.

4) Click on Host.exe to start the service and on Client.exe to start the client. All the events will be logged to file(s) whose location(s) can be set from the "Diagnostics" folder in the GUI.

Tuesday, August 21, 2007

Rules for Successful BAM implementation

I learnt from experience that by observing the following rules, BAM execution can become fairly smooth. Some of the steps are not "need to have" but rather "good to have".

1) Install SQL 2005 sp2
2) Ensure Analysis Services is installed (Else cubes/aggregations can't be created)
3) Do not forget to Configure BAM tools to "Enable Analysis Services for Aggregation" using Biztalk config tool. (Else cubes/aggregations can't be created)
4) Ensure that the account being used is a member of the group that is configured in the BizTalk config tool for the BAM portal. (Else the views in the BAM portal may not be visible to the user.)
5) Ensure that the user account under which the BAM application pool is running is a member of the BizTalk Isolated Host and IIS_WPG groups.
6) Run the following commands: (Else you may get a 401 authentication error.)

  • cscript adsutil.vbs SET w3svc/NtAuthenticationProviders "NTLM"
  • setspn -A HTTP/servername domain\account
  • setspn -A HTTP/servername.fullyqualifieddomainname domain\account

domain\account is the account under which the BAM App Pool is running in IIS.

cscript can be found under: C:\Inetpub\AdminScripts

setspn.exe can be found under the Windows 2003 CD-ROM at \support\tools\suptools.msi


If you can view the BAM portal but can't see all the views you created, it is most likely that the account you are using does not have permissions on the "BAMPrimaryImport" database. Add that account (e.g., biztalkuser) and then assign the role (view) you want that user to see on the BAM portal. Check the image below:



In the image above, the user (biztalkuser) only has permission for bam_EndToEnd. If you want this user to see other views, you can, for example, check "bam_Test1View"; the user will then be able to see both views on the BAM portal. This way you can selectively grant users access to different views (roles).

Please note: SQL 2005 creates the roles; you just have to assign them to the appropriate user. SQL 2005 no longer supports the concept of "Groups"; use roles instead.

Friday, August 17, 2007

Strange behavior of Biztalk 2006 on SQL cluster environment

While testing a BizTalk application in server farm mode at one of my clients, I came across something interesting. This BizTalk application used the SQL adapter to connect from an orchestration and execute a stored proc. All tests for the server farm went OK: the BizTalk server failover sequence worked fine, and the NetScaler worked fine for the web farm. We ran into an issue when we tried failing over the SQL cluster from the original (initial) node to the other. On the original node, everything worked perfectly. After failing over to the second node, I found that the BizTalk application would not connect to the custom database to execute the stored procedure. We thought it might be a connection pool caching issue, so we tried restarting the physical BizTalk servers and hosts. Nothing worked. Then I thought it might be something to do with the 32-bit SQL adapter not working properly on the 64-bit cluster (not a very smart assumption in retrospect). We also played with the MSDTC settings by reducing the authentication level. Nothing worked. The fact that threw me off was that everything was working fine on the original node. We only had one physical copy of the entire BizTalk database on the cluster, not multiple database instances, so it did not make sense to me why the same database was acting differently with different instances of SQL on each node.

I thought logically for 4 hours and tried different things. Nothing worked. I came back the next day and started thinking out of the box, or rather illogically. We gave explicit permissions to the BizTalk service account on the "custom" database that contained the stored procedures. We made the BizTalk service account a database owner and voila! It worked! Yeah, that simple! We could not understand the behavior, but it worked. We thought of doing a further post mortem: we took the permission away and failed over to the original cluster node, and it worked as it had before. For some reason BizTalk did not care about the permissions while working on the original node. We failed over again to the second node without the permissions and it did not work again. We granted the permissions and it worked!

We left the solution in that working state but could not explain the reason for this behavior. We have a couple of theories, but none of them is convincing. Is it a bug? Maybe. Is it a cluster configuration issue? Maybe.

Tuesday, August 07, 2007

Biztalk and SilverLight

The gloves are off! MSFT is going head-on against Adobe with the recent launch of Silverlight, a product that will compete against the market leader, Flash. MSFT dreams of snatching some serious business from Adobe. With the Java world not lagging behind, the market has just started heating up.

Silverlight will enable developers to deliver rich browser-based applications, a great new tool for web developers. A richer out-of-box BAM portal is one area where BizTalk could gain from this new development: a rich BAM portal with interactive charts and graphical real-time display of data could really grab the attention of business users. How about adding this to the BizTalk 2008 wish list!

Monday, August 06, 2007

Implementing Business Rule Engine

This weekend I got down and dirty with the BRE (Business Rule Engine). It is a powerful tool for managing dynamic business rules and is very efficient in complex logic scenarios. I thought of playing with it and worked out some scenarios.

Scenario:


A company wants to categorize its workers under different “Bonus” levels to decide the dollar value of annual bonuses. Each bonus level is based on variables like the number of projects sold, the dollar value of each project sold, and whether it is a new client or an existing one. Other variables like future potential can also play in. This business logic can change over a period of time and needs to be controlled by business analysts and the management team; that means no complex SQL queries. In a real-world scenario, this means having a policy that can interact with multiple databases/tables on the client side. This policy can be invoked by a web service, which will pass objects as facts into the BRE. The BRE will then update the specified columns in the database with the correct “Bonus” categories.

Scenarios can range from simple ones that take a single object as an input fact to complex ones that take multiple objects.

Proof of Concept :

1) Create a simple database called “TestRules”.
2) Create a table called “Customers”
3) Define 3 columns: Author, ID and Status
4) Create a policy called “Testing”.
5) Under this policy, define a rule “Test1” (Image 1)
6) Define 2 vocabularies, TestBRE 1.0 and 1.2 (Images 2 and 3). The difference between them is the nature of the “Binding”; I will explain that later on.

Email me for complete solution @ shashi_1273@yahoo.com


Image 1:

Image 2 (Database Binding type as DataConnection )





Image 3 (Database Binding type as Data Table/ Data Row )



There are 3 facts in the rule:


1. Name of the author (coming from the database)
2. The input object on the right-hand side of “Contains”. This can be a simple string that gives the lookup value as the name of the “Author”. In a complex scenario this can be an XML object that contains multiple fields for “Author”. These name(s) will then be matched against the “Author” column of the “Customers” table.
3. “Status” under the action pane. This will be updated based on the above 2 facts.


This policy can then be invoked by a web service, which will call the Policy.Execute method after passing the required object parameters. For this proof of concept we will use the following combinations:

1) Passing Single Object as Parameter:

In this case the only object passed will be the database connection. The second fact can be a hard-coded string, i.e. “Shashi”. In the 2-object scenario we will use an XML object instead of the string; this XML object will be an array of values.

The implementation for the 2-object scenario will cover this section too.


2) Passing Two Objects as Parameter:

In this case the objects passed will be the database connection and an XML document. This XML document can contain multiple fields, one of which can be “Author”. This “Author” field can have multiple names; in our sample we have 3 names: Shashi, Kent and Jeff.

a) Database Binding type as DataConnection
b) Database Binding type as Data Table/ Data Row

Both of the above types can be set up while using the wizard to create a vocabulary. Use the DataConnection type if the count of rows returned is more than 10 (http://blogs.msdn.com/biztalkbre/); otherwise use the Data Table type. Also, for long-term facts that are based on caching, use the Data Table / Data Row type. The reason is that no additional benefit is gained by caching a DataConnection object, since the connection pool already does that. Implement IFactRetriever for long-term facts. For this particular example, we will not use long-term facts.
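For completeness, a rough sketch of what an IFactRetriever for such a cached long-term fact could look like (the class name, connection string and table are illustrative; the Microsoft.RuleEngine types are the same ones used in the implementations below):

using System.Data;
using System.Data.SqlClient;
using Microsoft.RuleEngine;

// Illustrative long-term fact retriever: asserts the Customers table once
// and reuses the handle on subsequent policy executions.
public class CustomerFactRetriever : IFactRetriever
{
    public object UpdateFacts(RuleSetInfo ruleSetInfo, RuleEngine engine, object factsHandleIn)
    {
        if (factsHandleIn == null)
        {
            SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Database=TestRules;Server=(local)");
            SqlDataAdapter da = new SqlDataAdapter("select * from Customers", conn);
            DataSet ds = new DataSet("TestRules");
            da.Fill(ds, "Customers");

            TypedDataTable tdt = new TypedDataTable(ds.Tables["Customers"]);
            engine.Assert(tdt);

            factsHandleIn = tdt; // a non-null handle tells the engine the facts are already asserted
        }
        return factsHandleIn;
    }
}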

An important thing to note is that in the case of database facts (with binding type Data Table / Data Row) and XML facts, typed objects need to be used; .NET facts do not need any “typing”. You will see in the implementation how the XML and database facts use TypedXmlDocument and TypedDataTable respectively.

It is a good practice to create a tracking file to understand how the BRE works; this file can also be used as a debugging tool.

Implementation for Case a)


SqlConnection conn = new SqlConnection("Integrated Security = SSPI;Database= TestRules; Server =(local)"); // establish connection to server

conn.Open(); //open the connection

DataConnection dc = new DataConnection("TestRules", "Customers", conn);

XmlDocument xd1 = new XmlDocument();
xd1.Load(@"C:\BiztalkProjects\BRE\TestSmall.xml");
TypedXmlDocument doc1 = new TypedXmlDocument("Books", xd1);
object[] shortTermFacts = new object[2];
shortTermFacts[0] = doc1;
shortTermFacts[1] = dc; // Note: No Typed dataset being used in this implementation

Policy pol = new Policy("Testing", 1, 0);
DebugTrackingInterceptor dtracking = new DebugTrackingInterceptor("ShashiOut.txt");

try
{
    pol.Execute(shortTermFacts, dtracking);
    dc.Update();
    conn.Close();
}
catch (Exception ex)
{
    System.Console.WriteLine(ex.ToString());
}

Implementation for Case b)


SqlConnection conn = new SqlConnection("Integrated Security = SSPI;Database= TestRules; Server =(local)"); // establish connection to server

conn.Open(); //open the connection

SqlDataAdapter da = new SqlDataAdapter("select * from Customers", conn); // create adapter to fill dataset .

//------explicitly define Update method for database adapter object. Directly using da.update will give error

SqlCommandBuilder commBldr= new SqlCommandBuilder(da);

commBldr.GetUpdateCommand();

da.UpdateCommand = commBldr.GetUpdateCommand();

//...

DataSet ds = new DataSet("TestRules"); // create a dataset

da.Fill(ds, "Customers"); //fill dataset

//....Not required for Single Object scenario
XmlDocument xd1 = new XmlDocument();

xd1.Load(@"C:\BiztalkProjects\BRE\TestSmall.xml"); //xml instance to be passed

TypedXmlDocument doc1 = new TypedXmlDocument("Books", xd1);

//............

TypedDataTable tdc = new TypedDataTable(ds.Tables["Customers"]);

// Only pass one fact (tdc) for single object scenario (First scenario)
object[] shortTermFacts = new object[2];
shortTermFacts[0] = doc1;
shortTermFacts[1] = tdc;

Policy pol = new Policy("Testing", 1, 0); // major and minor versions

DebugTrackingInterceptor dtracking = new DebugTrackingInterceptor("ShashiOut.txt"); // writes the debug file under the "bin" folder

try
{
    pol.Execute(shortTermFacts, dtracking);
    da.Update(ds, "Customers");
    conn.Close();
}
catch (Exception ex)
{
    System.Console.WriteLine(ex.ToString());
}

Take Away :

1) While testing a policy, the BRE creates the database connection/objects for you. While executing the policy from a .NET class, the database objects need to be created explicitly.

2) It is a good practice to create a tracking file while executing a policy. This can help troubleshoot issues during development. It can also be compared with the tracking log created by the Business Rule Composer when testing a policy. This way we can observe the list of parameters that the policy is expecting and make sure that these parameters are correctly passed when executing the same policy from outside the Rule Composer. By comparing these 2 logs, I was able to figure out when the policy expects a TypedXmlDocument and a TypedDataTable.

3) DataSet object and XML document have to be passed as "Typed".

4) The 2 Object scenario can be extended for multiple Objects.

5) I know it is very painful to copy and create new versions of a policy whenever modifications are required; the same applies to vocabularies. The following SQL commands can ease that pain by undeploying the policies/vocabs and allowing users to make modifications. It is highly recommended that these commands be used only in the development phase.

For Policies:

declare @RuleSetId Int
select @RuleSetID =nRuleSetID
FROM re_Ruleset
WHERE strName = 'Testing'
AND nMajor=1 and nMinor=0
UPDATE re_ruleset
SET nStatus =0
WHERE nRuleSetID = @RuleSetID
DELETE FROM re_deployment_config WHERE nRuleSetID=@RuleSetID
DELETE from re_tracking_id WHERE nRuleSetID=@RuleSetID

For Vocabs:

UPDATE re_Vocabulary
SET nStatus = 0 /* to republish, set it back to 1 */
WHERE strName = 'Vocabulary1' AND nMajor = 1 AND nMinor = 0

I think the BRE is a very powerful tool if used properly. The key is to use it for complex scenarios that can leverage functions like Update, Assert and Retract; using it for simple queries is an example of under-utilization of the tool. In terms of performance, the BRE will work better than a traditional SQL query in complex scenarios.

It is not truly a business-user tool. Concepts of versions, vocabularies and facts are not easy for a business user to understand. Also, the fact that a deployed policy can't be changed unless a new version is created does not help either. Developers are required for the initial setup and for creating new vocabularies, etc. Business users can definitely change variable values that are already set up, but they will need some training to do more than that.

I would imagine that MSFT understands these concerns and we can definitely see improvements in subsequent versions.

Tuesday, July 31, 2007

Biztalk Best Practices

We recently finished a BizTalk project with one of our clients. It was a 4-week project with tight deadlines, and our team did a great job delivering a quality product on time. I joined the project midway, after 2 weeks, and finished the error handling and fine tuning. The overall architecture of the project was straight out of the “BizTalk Best Practices” book, which was one of the reasons it turned out to be successful. I would recommend this approach, as documented by Marty, to everyone.

Scenario:

1. The main orchestration A will be kicked off by a web service or a file drop (as backup).
2. Orch A will send an email informing users, using Orch B (the email orchestration).
3. Orch A will then trigger 5 different orchestrations (C, D, E, F and G) in a pre-defined sequence.
4. This sequence will be set up at design time. Some of these orchestrations will execute in parallel.
5. Each of these orchestrations (C, D, E, F and G) will call Orch B after each call, before control goes to the next orchestration in the sequence.
6. Users will click on the email, and that will trigger a web application. This web application will send a "correlation" ID back to one of the orchestrations (C, D, E, F and G), which will complete that orchestration, thus sending control to the next orchestration.

Our approach:

1. Use multipart messages.
2. Use direct binding through the MessageBox to link multiple orchestrations.
3. Use separate projects for internal and external schemas.
4. Never expose internal schemas directly to entities outside BizTalk.

A great deal of information can be found at:

http://msdn.microsoft.com/msdnmag/issues/07/05/BizTalk/default.aspx

We could have also gone with the "Partner Port" approach. This approach would have reduced some of the orchestration code we had to put in for changing the promoted values to eliminate the chance of an "infinite loop". But then we would have lost flexibility, and the client demanded flexibility.

Wednesday, July 25, 2007

Using Correlation in Loop

Using a correlation set within a loop is not a straightforward process in BizTalk. One way around this is to define a scope within the loop and define the correlation set inside that scope; with each iteration of the loop, the correlation set will be initialized again. This approach can lead to overhead due to the additional persistence points introduced by a transactional scope. Another approach that I recently used in a direct binding scenario is the use of a "Null Adapter": have a dummy send shape outside the loop that initializes the correlation set, and use the Null Adapter with that send port.

DebugViewer in Biztalk

"DebugViewer" is another cool feature that comes with Biztalk 2006. Actually it is part of Studio 2005. Simple steps to use it:

1) Download and Install the tool from http://www.microsoft.com/technet/sysinternals/Miscellaneous/DebugView.mspx.

2) Use System.Diagnostics.Debug.WriteLine("Write the Message Here") inside the orchestration to emit the message.
3) Check the Capture menu in the tool to ensure that all options are check-marked except the last one (Log Boot); by default they are.
4) You can even filter the events in the viewer. Add a prefix before the message and use this prefix to filter, for example: System.Diagnostics.Debug.WriteLine(Prefix + "Write the Message Here"). The filter gives both "include" and "exclude" options.

PS: Ensure that you are using "Development/Debug" mode while deploying the assembly; the Debug calls do not work in "Release/Deployment" mode. This is an added benefit of using the Debug class: there is no need to remove these lines of code when moving from the development to the deployment version, since they are suppressed in the deployment build by design. That is not quite the case if System.Diagnostics.EventLog.WriteEntry is used; those lines have to be removed when moving from the development to the deployment version, otherwise there will be a performance hit.
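A minimal sketch of this prefix convention (the helper class and prefix are made up for illustration):

using System.Diagnostics;

// Helper callable from orchestration expression shapes; Debug.WriteLine calls are
// compiled out of Release builds, so these lines can stay in deployed code.
public static class OrchTrace
{
    private const string Prefix = "ORDERPROC: "; // filter on this prefix in DebugView

    public static void Write(string message)
    {
        Debug.WriteLine(Prefix + message);
    }
}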

Friday, June 22, 2007

SWIFTNet Adapter

I started looking into SWIFT as an apparent area of push by Microsoft into the financial sector. For people who have not worked in the financial services sector, SWIFT stands for "Society for Worldwide Interbank Financial Telecommunication". SWIFT is both a standards body and a secure network, like a VAN; it also provides software solutions. This secure network, also known as SWIFTNet, connects various banks and corporations, helping them settle payments resulting from various buy and sell trades.

Microsoft introduced the first SWIFT accelerator with BizTalk 2004; BizTalk 2006 contains an updated version of the SWIFT accelerator (version 2.3). The partnership with SWIFT to develop the SWIFT accelerator and out-of-box rules via the Rules Engine is an important strategic move from MSFT to capture the financial services market. It is aimed at the back office, where all the trade settlements take place. Java technologies reign supreme at the front end of the trading spectrum, using the FIX protocol. MSFT aims to capture the back office market, which handles relatively lower volume per day and where more emphasis is placed on security and reliability and less on speed.

Microsoft is leveraging its influence on the global banking industry to push the SWIFT standards and the SWIFT-BizTalk combination. Internally, MSFT is in the process of implementing this solution for its Treasury/Accounting department and is using the in-house implementation as a case study for various corporations and banks looking to implement the BizTalk-SWIFT combination.

Technically, it is a great solution. Users can use InfoPath templates to change the messages and then use BizTalk as a single point of contact to connect with the SWIFT network. All the relational validations on incoming/outgoing messages are created using the BRE and hence are easy for non-technical users to change. This provides a greater degree of flexibility and control in the ever-changing world of banking and financial regulations.

Monday, May 14, 2007

Passed my Certification Exam!

I passed my BizTalk 2006 certification exam today. I would say that I am more relieved than happy. I have one regret though: I wish I were working on BizTalk in my current project. My last BizTalk project was 3 months ago, and it was heavily tilted towards BizTalk 2004. Ongoing real-world project experience in BizTalk 2006 would have definitely helped my preparation. My preparation was spread over just two weekends: I used those 2 weekends (4 days) to go over two books, BizTalk 2006 Recipes and BizTalk 2004 Unleashed, tried my hand at the SDK samples for BRE/BAM, and read as much as I could from the Microsoft website. I was sleep deprived on those weekends :)

I will not recommend the approach I took; I had to take it because of my situation. Ideally, I would recommend spending at least 2 days doing the BAM, BRE and other examples from the SDK. Working on an ongoing BizTalk project will definitely give you an edge while preparing for the exam; there is then no need to specially allocate time to prepare like I had to do. I knew that Orchestration, Messaging and Administration were my strong areas, so I made an effort to focus more on areas like BAM, BRE and Role Links, which are not my strengths.

Percentage breakdown of my preparation:

1. 50% of my time on BAM, BRE and Role Links
2. 20% on Orchestration and Messaging
3. 10% on what is new in BizTalk 2006
4. 10% on Deployment and Administration
5. 10% on random reading of BizTalk blogs

Saturday, April 21, 2007

Wild Wild World ....

It has been a while since I sat down in front of my computer to do any blogging. I joined a new company in February and then took a 2-week vacation. I came back and have been working like crazy. The current project has been very challenging and a great learning experience. It is not an integration project, and I am diversifying into the area of Business Intelligence. I am collecting business requirements for a financial system, doing project management, and providing direction to the financial department in the face of the changes they have been going through lately. Doing this, I am facing the classic dilemma: to diversify or to specialize?

I love working on business integration projects involving BizTalk, and this is my core strength. At the same time I want to get more exposure to other aspects of technology to give me a broader perspective while dealing with business problems and providing solutions. Well, there are definitely pros and cons to each approach. One of the immediate challenges I am facing is that I have my BizTalk 2006 exam due in less than a month and I still have no plan in place to prepare for it. I have been struggling to carve out some time, but the present project is really demanding and does not leave me any time during the week or on the weekend. The only time I get is at night, but then I am too tired to do anything. The fact that I am not working on a BizTalk project also does not help me put my mind in the right BizTalk perspective at the end of the day when I sit down with my BizTalk book to prepare for the exam.

I really need to find a way to put a plan in place and I have to do it soon! Wish we had a 40 hour day.

Thursday, February 22, 2007

BAM Deployment Error



I received the following error while deploying a BAM project in BizTalk 2006. The error read:

“BAM Deployment failed.
Encountered error while executing command on SQL server “server name”.
Incorrect Syntax near ‘@@Error’.
Must declare scalar variable “@@UniqueID”.”

Screenshot of the error is below:



I opened a trace at the SQL Server level, extracted the stored procedure that was giving the error, and ran it directly on the server. I got the following error:

Msg 102, Level 15, State 1, Procedure bam_Test1_UpsertInstance, Line 15
Incorrect syntax near '@@Error'.
Msg 137, Level 15, State 2, Procedure bam_Test1_UpsertInstance, Line 68
Must declare the scalar variable "@@UniqueID".
Msg 137, Level 15, State 2, Procedure bam_Test1_UpsertInstance, Line 111
Must declare the scalar variable "@@UniqueID".
Msg 137, Level 15, State 2, Procedure bam_Test1_UpsertInstance, Line 140
Must declare the scalar variable "@@UniqueID".

I changed the two variables representing the names of the BAM activity items, “Error” and “Unique ID”, to “Exception” and “TransactionID” respectively, ran the command line for deployment again, and bingo, it worked!!

It seems that BizTalk uses these 2 names (Error and UniqueID) internally and will not accept them as user-defined item names. There may well be more than two such reserved names that cause similar errors; if we get a similar error again, it is very likely that we stepped on more of them.

MOM Architecture

MOM is another great tool for monitoring a distributed BizTalk environment. It can easily be configured to raise alerts that trigger some activity, like sending an email to a person or a group, or running a batch file or a script. This event/error-based triggering is very helpful while administering a widespread distributed environment; it enables people to take the necessary action at the right time.

The basic components that constitute MOM are captured below.



Wednesday, January 24, 2007

Another Successful Implementation

Months of hard work finally helped us execute another successful BizTalk implementation this week. This was the reason I was not able to blog for the past few months: I spent the past 3 months managing and organizing the activities for the big implementation. It was great teamwork and everything fell into place; hours and hours of work finally paid off. The impact of the implementation can be understood from the fact that the entire system was offline for users for almost a week! There were about 5 cross-functional teams, with about 100 people, working on this big implementation, and offices worldwide were affected. BizTalk was a major part of the complete implementation, if not the entire implementation. Everything had to be timed to perfection between all the teams to get a flawless implementation.

I also want to share a piece of code that I created as a workaround for an existing process. With the recent implementation, the transaction volume is projected to increase fourfold, both in frequency and size. The present implementation is very database-centric, as all the business validations are run through queries in Oracle. This leads to instances of “connection pool time out” issues. Even though the code is optimized, this issue is something we have not been able to eliminate completely. There are definitely some architectural ideas that could help, but we have neither the budget nor the time to implement them. So the workaround is to control, or throttle, the number of transactions passing through BizTalk. Since we can’t tell users not to send files in multiple batches, one thing we can do is control the execution from our end.

The Plan:

1) Create a common folder that will receive the files
2) Create a way to copy the files from this folder to biztalk pick up locations and then delete the files. The following code is the core of the solution that will do the job.
---------------------------------------------------------------------------
Private Sub Button3_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button3.Click

    Dim fileNameCollection() As String
    Dim ActualFilename As String
    Dim LastIndex As Integer

    ' set source folder
    fileNameCollection = System.IO.Directory.GetFiles("C:\FileMover\")

    Try
        Dim i As Integer
        For i = 0 To fileNameCollection.Length - 1
            LastIndex = fileNameCollection(i).LastIndexOf("\")
            ActualFilename = fileNameCollection(i).Substring(LastIndex + 1)

            ' throttle between files; may not be required when run as a scheduled job
            System.Threading.Thread.Sleep(10000)

            System.IO.File.Copy(fileNameCollection(i), "C:\FileMover\Archive\" + ActualFilename, True)
            System.IO.File.Delete(fileNameCollection(i))
        Next

    Catch ex As Exception
        ' log or report the error as needed
    Finally

    End Try

End Sub

------------------------------------------------------------

This code will be further made database-configurable using a SQL database, the variables being “Folder Locations”, “Number of files to be dropped” and “Time Interval” between each file.

Another possible use of this application is during stress testing. Instead of manually dropping the files, this application can drop them automatically after all the parameters are configured. So switch on the performance indicators and go home! When you come back, all the test statistics will be written to the file.