Friday, February 25, 2005

A Faster Way of Deploying Assemblies to the Servers

One technique for deploying assemblies directly to the server, without building them on your local machine, is:

1) Right Click the Project in Solution Explorer.

2) Go to Properties > Configuration Properties > Deployment.

3) Select the server name and the configuration database, and set "Redeploy" = True.

4) Click Deploy

Setting Redeploy = "True" lets BizTalk deploy over the older assemblies already on the server. This way we do not have to undeploy and then deploy again.


Warning: Make sure to revert the settings from your server back to your local machine after completing the above steps. Otherwise you might end up overwriting the server solution with your local machine solution (for better or worse).

Sunday, February 13, 2005

Synchronizing Method Executions from Orchestrations

One problem I faced on a project was multiple files entering the same method at the same time. If that method reads information from the database, it can end up being executed concurrently for multiple XML documents, exceeding the database connection pool. To avoid this, .NET (VB.NET) offers the "SyncLock" statement.

Wrap the method call in a SyncLock block. This ensures that a second document does not enter the method until the previous one has been completely processed.

Example:


Private Shared ReadOnly syncRoot As New Object()

SyncLock syncRoot
    ' Only one caller at a time executes the function below
    CallTheFunctionHere()
End SyncLock

Tuesday, February 08, 2005

De-Batching and Re-Batching the Files (Scatter-Gather)

In my current project we had a requirement to split incoming XML files into individual transactions. These individual transactions then go through rule validations in an orchestration. Depending on the validation rules set up in external assemblies, each transaction is either rejected or accepted. The business need for splitting the file came from the fact that each transaction has to be accepted or rejected as a whole.

The splitting part was easy; we used envelope and document schemas for that. The difficult part was to batch the split transactions back per the incoming file they originally belonged to. I will give the possible solutions for "re-batching" here, along with the limitations of each.

1) Use a loop within the orchestration and add each transaction on the basis of a node value. A correlation set will have to be set up. This is important because the correlation ID helps us add only the transactions that belong to a particular file. This approach is useful if we have something like an "Orders" file that consists of various "Items": the Order ID is the correlation ID in this case, tying up all the "Items" for a particular Order ID.
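As an analogy for what the correlation set achieves, here is a small Python sketch that groups split "Item" transactions back together by their Order ID (the dict layout and the `order_id` key are assumptions for illustration, not BizTalk API):

```python
from collections import defaultdict

def rebatch_by_correlation_id(transactions):
    """Group split transactions back into batches keyed by correlation ID.

    Each transaction is a dict; the 'order_id' key stands in for the
    promoted property that the correlation set would match on.
    """
    batches = defaultdict(list)
    for txn in transactions:
        batches[txn["order_id"]].append(txn)
    return dict(batches)
```

Every transaction carrying the same Order ID lands in the same batch, regardless of the order in which the transactions arrive.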

This approach fails in my situation because I have no correlation ID. My sample file consists of a root element immediately followed by child nodes, and these child nodes contain all the information. The sample file looks something like:

<a><b>contains all the data</b></a>

The ideal structure for using the looping approach would be:

<a>correlation ID here<b>contains all the data</b></a>

2) Loop and add transactions on the basis of a count.

The problem with this approach: in a scenario where 10 input files are dropped at the same time, each containing 10 transactions, the count will not ensure that the transactions we are adding actually come from a single file (as we need in our case). It is not guaranteed that the transactions from each file will be processed in sequence.
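To see why the count is unreliable, consider a sketch in which transactions from two files arrive interleaved (the file names and the interleaving order are hypothetical):

```python
def batch_by_count(arrival_order, batch_size):
    """Naively cut the arrival stream into fixed-size batches."""
    return [arrival_order[i:i + batch_size]
            for i in range(0, len(arrival_order), batch_size)]

# Transactions from two files of three transactions each,
# arriving out of file order:
arrivals = ["file1-t1", "file2-t1", "file1-t2",
            "file2-t2", "file1-t3", "file2-t3"]

# Cutting after every 3 transactions does NOT recover the original files:
first_batch = batch_by_count(arrivals, 3)[0]
```

The first "batch of three" contains transactions from both files, which is exactly the failure described above.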

3) The third approach works for me. I set up a unique ID at the transaction level; this ID is the same for all transactions from one particular file. I can then use this ID as the "File Name" in the send port and select "Append". BizTalk will append all transactions with the same unique file name into one file. This is exactly what I need!
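The third approach can be sketched as follows. This is a pure-Python stand-in, not BizTalk code: the dict plays the role of the output files, and `append_to_named_output` mimics a send port whose file name is the stamped ID with Append enabled.

```python
import uuid

def stamp_batch_id(transactions):
    """Give every transaction from one input file the same unique ID."""
    batch_id = str(uuid.uuid4())
    return [dict(txn, batch_id=batch_id) for txn in transactions]

def append_to_named_output(outputs, txn):
    """Append a transaction to the 'file' named after its batch ID,
    like a send port writing to a file name derived from that ID
    with Append enabled."""
    outputs.setdefault(txn["batch_id"], []).append(txn["data"])
```

Because the ID is minted once per input file, transactions from different files can arrive in any interleaved order and still end up appended to the correct per-file output.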