Wednesday, August 29, 2007

Implementing WCF

Why WCF?

WCF, aka Windows Communication Foundation, is the latest product from MSFT for creating distributed applications. Year 2007 has been buzzing with new product releases on a weekly basis, and it is sometimes hard to keep track of them all. I can speak for myself: I end up missing a release now and then. It is tough to keep one eye on client work and another eye on MSFT press releases, but it has become part of the job. Working in the consulting sector is a 24/7 job. When you are not working with a client, you are scanning websites for bits and bytes on the latest product releases and technology trends. I do love my job, but sometimes it gets really tough to keep up with the pace of technology change. Sometimes you have to be like Tiger Woods and choose what to let go. If only everyone was as good as him!

Distributed technologies have become the center of a turf war between technology companies, and web services are an important piece in this war. The turf is dotted with competing platforms like Java and open source products like Tungsten.

WCF is a further enhancement of the ASMX web services released with .NET Framework 2.0. WCF is part of .NET Framework 3.0, and version 3.5 is out too in beta form. WCF gives users the benefits of ASMX, WSE and messaging services in one product. WCF supports the MSMQ, HTTP and TCP protocols, whereas ASMX web services only support HTTP; additional transports can also be configured for WCF. WCF also offers a configuration editor with a GUI for monitoring performance and tracking messages and errors. This tool helps reduce operation cost with out-of-the-box monitoring and tracing of data. Details at the end.

It is a highly flexible and configurable solution but does have some drawbacks compared to .NET Remoting, especially when it comes to running multiple service host instances under a single host; .NET Remoting actually lets you add service instances. WCF can be self-hosted or hosted in IIS or WAS. While running WCF as a self-hosted service, check the Task Manager window: the WCF service will show up as Services.exe, and there will be only one instance of this service at any given time. Multiple instances are not supported.






Demo:

Used the MSFT base sample, organized it and added personal notes and updates.
1) Create a blank solution called WCF
2) Add a project and call it “Host”
3) Add a C# library file and call it “Host.cs”
· Add a reference to System.ServiceModel.dll
4) Add another project and call it “Client”


I) Create WCF Host


5) Define Service Contract
// Define a service contract. (Requires: using System.ServiceModel;)
[ServiceContract(Namespace = "http://Microsoft.Demo.WCF")]
public interface IMath
{
    [OperationContract]
    double Add(double n1, double n2);
}
The “ServiceContract” attribute marks the interface as carrying metadata for WCF.
The “OperationContract” attribute plays a role similar to the “public” keyword: it exposes the operation so the client can access it.
PS: Since the data type used is simple, we have not explicitly defined a “DataContract”. For complex data types it is important to define the DataContract explicitly. It is also a good practice to keep the “ServiceContract” and “DataContract” as separate implementations.
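For illustration, here is a minimal sketch of an explicit DataContract for a complex type (the type and member names are my own inventions, not part of the MSFT sample):

// Hypothetical complex type. (Requires: using System.Runtime.Serialization;)
[DataContract]
public class MathRequest
{
    [DataMember]
    public double Operand1;

    [DataMember]
    public double Operand2;
}

An operation like double Add(MathRequest req) could then exchange this type with clients.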

6) Implement the service

Implementing the service contract is a rather easy process: just use a C# (or VB) class that implements the service interface. This is where the logic behind the service lives.

//Implement the service
public class MathService : IMath
{
    public double Add(double n1, double n2)
    {
        double result = n1 + n2;
        Console.WriteLine("Return: {0}", result);
        return result;
    }
}

7) Create Host for the service


So far the service is just a class library. We will need a host for the service. As mentioned earlier, there are 3 options:

a. Create Self Host
b. Use IIS
c. Use WAS

In this demo we will create our own host because it is easy to debug. Before we create the host, we need to define the address where the service will reside.

This step consists of the following five sub-steps:

d. Create a base address for the service.
e. Create a service host for the service.
f. Add a service endpoint
g. Enable metadata exchange.
h. Open the service host

// Step d: create a base address for the service.
Uri baseAddress = new Uri("http://localhost:8000/DemoWCF/Service");

// Step e: create a service host for the service.
ServiceHost serviceHost = new ServiceHost(typeof(MathService), baseAddress);
try
{
    // Step f: add a service endpoint.
    serviceHost.AddServiceEndpoint(
        typeof(IMath),
        new WSHttpBinding(),
        "MathService");

    // Step g: enable metadata exchange.
    // (Requires: using System.ServiceModel.Description;)
    ServiceMetadataBehavior smb = new ServiceMetadataBehavior();
    smb.HttpGetEnabled = true;
    serviceHost.Description.Behaviors.Add(smb);

This step is required to be able to run the svcutil command later on to download the 2 client files (config and class). Otherwise the user may get the following error:

“There was no endpoint listening at http://localhost:8000/DemoWCF/service that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.
The remote server returned an error: (404) Not Found.”

This step is optional if the user decides to create both client files manually.

    // Step h: open the service host.
    serviceHost.Open();
    Console.WriteLine("The service is ready.");
    Console.WriteLine("Press <ENTER> to terminate service.");
    Console.WriteLine();
    Console.ReadLine();
    serviceHost.Close();
}
catch (CommunicationException ce)
{
    // Abort the host if communication fails (this closes the try block opened in step e).
    Console.WriteLine("An exception occurred: {0}", ce.Message);
    serviceHost.Abort();
}
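Since the transport is just the choice of binding, the same service could also be exposed over TCP by adding a second endpoint in step f. A minimal sketch, assuming port 8001 is free (the net.tcp address is my own choice):

// Additional TCP endpoint (sketch; add before serviceHost.Open()).
serviceHost.AddServiceEndpoint(
    typeof(IMath),
    new NetTcpBinding(),
    "net.tcp://localhost:8001/DemoWCF/Service");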

II) Create WCF client


8) Run the svcutil command to generate the client class and config file:

C:\Program Files\Microsoft Visual Studio 8\Common7\IDE> svcutil /language:cs /out:generatedClient.cs http://localhost:8000/DemoWCF/service

The 2 generated files will be: output.config and generatedClient.cs.

9) Add both of these files to the “Client” project. The complete solution will look something like the image below; note the added references. Rename the files to app.config and MathClient.cs.



10) Add the following code to Client.cs:

namespace Microsoft.Demo.WCF
{
    class Client
    {
        static void Main()
        {
            try
            {
                // Create an EndpointAddress instance for the base address.
                // This endpoint address should be the concatenation of the baseAddress
                // variable in the Host class and the address parameter passed to
                // AddServiceEndpoint in the Host class.
                EndpointAddress epAddress = new EndpointAddress("http://localhost:8000/DemoWCF/Service/MathService");
                MathClient client = new MathClient(new WSHttpBinding(), epAddress);

                // Call the contract implementation in the Host class from the Client class.
                double value1 = 29;
                double value2 = 16.00;
                double result = client.Add(value1, value2);
                Console.WriteLine("Add({0},{1}) = {2}", value1, value2, result);

                // Close the WCF client.
                client.Close();

                Console.WriteLine();
                Console.WriteLine("Press <ENTER> to terminate client.");
                Console.ReadLine();
            }
            catch (CommunicationException ce)
            {
                Console.WriteLine("An exception occurred: {0}", ce.Message);
            }
        }
    }
}

This is a simple scenario with a single service contract, a single implementation and a single endpoint.

Scenario:

A new client comes in and wants a new service operation called “Subtract”.

Two ways of doing this:

1) Add the new operation “Subtract” to the existing interface IMath
2) Create a new interface IMath1 for “Subtract”


Add the new operation “Subtract” to the existing interface IMath

This is the simplest of updates, but it does not give us separation between the two operations, as they are part of the same interface. The implementation stays in the same class with the same service host and endpoint. The only thing that changes is on the client side, where the client calls “client.Subtract” just like “client.Add” in the demo above. A sketch of the updated contract follows.
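A minimal sketch of option 1, assuming “Subtract” mirrors the signature of “Add”:

// Existing contract extended with the new operation.
[ServiceContract(Namespace = "http://Microsoft.Demo.WCF")]
public interface IMath
{
    [OperationContract]
    double Add(double n1, double n2);

    [OperationContract]
    double Subtract(double n1, double n2);
}

MathService then implements Subtract alongside Add in the same class.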

Create a new interface IMath1 for “Subtract”

Create a new interface, say IMath1, that contains “Subtract”, and a second class, say “MathService1”. We can’t use the same class to implement the 2 interfaces under the same namespace, so we will need to create a new class to implement the new interface (service contract). Another problem that comes with this is the need to have 2 service hosts: each service host is tied to a unique class (service contract implementation). Since we are using a “self created” host, we can’t create 2 service hosts to implement the 2 unique classes.
This is one of the drawbacks of WCF. Unlike .NET Remoting, where we can add multiple instances of a service under a single host, we can’t do the same in WCF.
Using IIS as the host can help solve this issue.

Key Points

1) Each WCF service class implements at least one service contract, which defines the operations the service exposes.
2) To distribute different sets of services to multiple clients, a class can implement multiple endpoints, each with a unique service contract.
3) Minimum of 1 endpoint for each service host.
4) Each service host is tied to only one implementation of a contract (class).
5) To expose multiple services, you can open multiple ServiceHost instances within the same host process (see the sketch after this list).
6) Although you can have multiple endpoints with a single service host, there is only one service type.
7) You can also dynamically create services with different implementing types, by dynamically creating new ServiceHosts.
8) Each endpoint should have a unique relative address.
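As a sketch of points 5 and 7, assuming a second contract IMath1 and an implementing class MathService1 already exist, two ServiceHost instances can be opened in the same process:

// Two hosts in one process (sketch; the addresses are my own choices).
ServiceHost host1 = new ServiceHost(typeof(MathService),
    new Uri("http://localhost:8000/DemoWCF/Service"));
host1.AddServiceEndpoint(typeof(IMath), new WSHttpBinding(), "MathService");

ServiceHost host2 = new ServiceHost(typeof(MathService1),
    new Uri("http://localhost:8000/DemoWCF/Service1"));
host2.AddServiceEndpoint(typeof(IMath1), new WSHttpBinding(), "MathService1");

host1.Open();
host2.Open();
Console.ReadLine(); // keep both hosts running until Enter is pressed
host2.Close();
host1.Close();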




WCF Configuration Editor




This tool can be accessed from the Visual Studio interface. Go to Tools --> WCF Configuration Editor.






Read the help file for details. In summary:


1) Open the config file that was generated after running svcutil. This will populate the different placeholders in the GUI with relevant data. Click on the Endpoint folder to view the existing configuration. This information can be changed from the GUI without even touching the XML file.





2) Click on the Diagnostics folder and use the toggle switches to browse through the various settings. Check the help file for details.



3) Closing the editor after making changes will save them to the original app.config that was opened in the first step. If we open the config file, we see additional tags added to perform all the functions that we configured in the GUI.

4) Click on the Host.exe file to start the service and on Client.exe to start the client. All the events will be logged in file(s) whose location(s) can be set from the "Diagnostics" folder in the GUI.

Tuesday, August 21, 2007

Rules for Successful BAM implementation

I have learnt from experience that observing the following rules makes a BAM implementation fairly smooth. Some of the steps are not "need to have" but rather "good to have".

1) Install SQL 2005 SP2.
2) Ensure Analysis Services is installed (else cubes/aggregations can't be created).
3) Do not forget to configure BAM tools to "Enable Analysis Services for Aggregation" using the BizTalk config tool (else cubes/aggregations can't be created).
4) Ensure that the account being used is a member of the group that is configured in the BizTalk config tool for the BAM portal (else views in the BAM portal may not be visible to the user).
5) Ensure that the user account under which the BAM application pool is running is a member of the BizTalk Isolated Host and IIS_WPG groups.
6) Run the following commands (else you may get a 401 authentication error):

  • cscript adsutil.vbs SET w3svc/NtAuthenticationProviders "NTLM"
  • setspn -A HTTP/servername domain\account
  • setspn -A HTTP/servername.fullyqualifieddomainname domain\account

domain\account is the account under which the BAM App Pool is running in IIS.

cscript can be found under: C:\Inetpub\AdminScripts

setspn.exe can be found under the Windows 2003 CD-ROM at \support\tools\suptools.msi


If you can view the BAM portal but can't see all the "VIEWS" you created, it is most likely that the account you are using does not have permissions on the "BAMPrimaryImport" database. Add that account (e.g., biztalkuser) and then assign the role (view) you want that user to see on the BAM portal. Check the image below:



In the image above, the user (biztalkuser) only has permission for bam_EndToEnd. If you want this user to see other views, you can, for example, click on "bam_Test1View". The user will then be able to see both views on the BAM portal. This way you can selectively grant users access to different views (roles).

Please note: SQL 2005 creates the roles; you just have to assign them to the appropriate user. SQL 2005 does not support the concept of "Groups" anymore; use roles instead.

Friday, August 17, 2007

Strange behavior of Biztalk 2006 on SQL cluster environment

While testing a BizTalk application in server farm mode at one of my clients, I came across something interesting. This BizTalk application used the SQL adapter to connect from an orchestration and execute a stored proc. All tests for the server farm went OK: the BizTalk server failover sequence worked fine, and the NetScaler worked fine for the web farm. We ran into an issue when we tried failing over the SQL cluster from the original (initial) node to another. On the original node, everything worked perfectly. After failing over to the second node, I found that the BizTalk application would not connect to the custom database to execute the stored procedure. We thought it might be a connection pool caching issue, so we tried restarting the physical BizTalk servers and hosts. Nothing worked. Then I thought that maybe it was something to do with the 32-bit SQL adapter not working properly on the 64-bit cluster (not a very smart assumption in retrospect). We also played with the MSDTC settings by reducing the authentication level. Nothing worked. The fact that threw me off was that everything was working fine on the original node. We only had one physical copy of the entire BizTalk database on the cluster, not multiple database instances, so it did not make sense to me why the same database was acting differently with different instances of SQL on each node.

I thought logically for 4 hours and tried different things. Nothing worked. I came back the next day and started thinking out of the box, or rather illogically. We gave explicit permissions to the BizTalk service account on the "custom" database that contained the stored procedures. We made the BizTalk service account the database owner and voila! It worked! Yeah, that simple! We could not understand the behavior, but it worked. We then did some further post mortem: we took the permission away and failed over to the original cluster node, and it worked as it had before. For some reason BizTalk did not care about the permissions while working on the original node. We failed over again to the second node without the permissions and it did not work again. We granted the permissions and it worked!

We left the solution in that working state but could not explain the reason for this behavior. We have a couple of theories, but none of them is convincing. Is it a bug? Maybe. Is it a cluster configuration issue? Maybe.

Tuesday, August 07, 2007

Biztalk and SilverLight

Gloves are off! MSFT is going head-on against Adobe with the recent launch of Silverlight, a product that will compete against the market leader, Flash. MSFT dreams of snatching some serious business from Adobe. With the Java version not lagging behind, the market has just started heating up.

Silverlight will enable developers to deliver rich browser-based applications. It is a great new tool for web developers. A richer out-of-the-box BAM portal is one area where BizTalk can gain from this new development: a rich BAM portal with interactive charts and graphical real-time display of data can really grab the attention of business users. How about adding this to the BizTalk 2008 wishlist!

Monday, August 06, 2007

Implementing Business Rule Engine

This weekend I got down and dirty with the BRE (Business Rule Engine). It is a powerful tool for managing dynamic business rules and is very efficient in complex logic scenarios. I thought of playing with it and worked out some scenarios.

Scenario:


A company wants to categorize its workers under different “Bonus” levels to decide on the dollar value of annual bonuses. Each bonus level is based on variables like the number of projects sold, the dollar value of each project sold, and whether the client is new or existing. Other variables like future potential can also play in. This business logic can change over a period of time and needs to be controlled by business analysts and the management team; that means no complex SQL queries. In a real world scenario, this means having a policy that can interact with multiple databases/tables on the client side. This policy can be invoked by a web service, which will pass objects as facts into the BRE. The BRE will then update the specified columns in the database with the correct “Bonus” categories.

Scenarios can range from simple scenarios that take single object as input fact to complex scenarios that take multiple objects.

Proof of Concept :

1) Create a simple database called “TestRules”.
2) Create a table called “Customers”.
3) Define 3 columns: Author, ID and Status.
4) Create a policy called “Testing”.
5) Under this policy, define a rule “Test1” (Image 1).
6) Define 2 vocabularies, TestBRE 1.0 and 1.2 (Images 2 and 3). The difference between them will be the nature of the “Binding”; I will explain that later on.

Email me for complete solution @ shashi_1273@yahoo.com


Image 1:

Image 2 (Database Binding type as DataConnection )





Image 3 (Database Binding type as Data Table/ Data Row )



There are 3 facts in the rule:


1. Name of the author (coming from the database).
2. The input object on the right hand side of “Contains”. This can be a simple string that gives the lookup value as the name of the “Author”. In a complex scenario this can be an XML object that contains multiple fields for “Author”. These name(s) will then be matched against the “Author” column of the “Customers” table.
3. Status, under the action pane. This will be updated based on the above 2 facts.


This policy can then be invoked by a web service, which will call the Policy.Execute method after passing the required object parameters. For this proof of concept solution we will use the following combinations:

1) Passing Single Object as Parameter:

In this case the only object passed will be the database connection. The second fact can be a hard-coded string, i.e. “Shashi”. In the 2 object scenario we will use an XML object instead of the string; this XML object will be an array of values.

The implementation for the 2 object scenario covers this section too.


2) Passing Two Objects as Parameter:

In this case the objects passed will be a database connection and an XML document. This XML document can contain multiple fields, one of which can be “Author”. This “Author” field can have multiple names; in our sample we have 3: Shashi, Kent and Jeff.

a) Database Binding type as DataConnection
b) Database Binding type as Data Table/ Data Row

Both of the above types can be set up while using the wizard to create the vocabulary. Use the DataConnection type if the count of rows returned is more than 10 (http://blogs.msdn.com/biztalkbre/); otherwise use the Data Table type. Also, for long term facts that are based on caching, use the Data Table/Data Row type, the reason being that no additional benefit is gained by caching a DataConnection object since the connection pool already does that. Implement IFactRetriever for long term facts; a sketch follows. For this particular example, we will not use long term facts.
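For reference, here is a minimal sketch of a long term fact retriever, assuming the standard Microsoft.RuleEngine.IFactRetriever interface (we do not use this in the demo; the connection string and table reuse the demo's names):

// Sketch only: asserts the Customers table once and reuses the handle afterwards.
public class CustomerFactRetriever : IFactRetriever
{
    public object UpdateFacts(RuleSetInfo ruleSetInfo, RuleEngine engine, object factsHandleIn)
    {
        if (factsHandleIn == null)
        {
            // First call: load and assert the long term fact.
            SqlDataAdapter da = new SqlDataAdapter("select * from Customers",
                "Integrated Security=SSPI;Database=TestRules;Server=(local)");
            DataSet ds = new DataSet("TestRules");
            da.Fill(ds, "Customers");
            engine.Assert(new TypedDataTable(ds.Tables["Customers"]));
            factsHandleIn = ds; // a non-null handle tells the engine the facts are already cached
        }
        return factsHandleIn;
    }
}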

An important thing to note is that in the case of database facts (with binding type Data Table/Data Row) and XML facts, typed objects need to be used. .NET facts do not need any “typing”. You will see in the implementation how the XML and database facts use TypedXmlDocument and TypedDataTable respectively.

It is a good practice to create a tracking file to understand how the BRE works; this file can also be used as a debugging tool.

Implementation for Case a)


// Establish connection to the server and open it.
SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Database=TestRules;Server=(local)");
conn.Open();

DataConnection dc = new DataConnection("TestRules", "Customers", conn);

XmlDocument xd1 = new XmlDocument();
xd1.Load(@"C:\BiztalkProjects\BRE\TestSmall.xml");
TypedXmlDocument doc1 = new TypedXmlDocument("Books", xd1);

object[] shortTermFacts = new object[2];
shortTermFacts[0] = doc1;
shortTermFacts[1] = dc; // Note: no typed dataset is used in this implementation.

Policy pol = new Policy("Testing", 1, 0);
DebugTrackingInterceptor tracking = new DebugTrackingInterceptor("ShashiOut.txt");

try
{
    pol.Execute(shortTermFacts, tracking);
    dc.Update();
    conn.Close();
}
catch (Exception ex)
{
    System.Console.WriteLine(ex.ToString());
}

Implementation for Case b)


// Establish connection to the server and open it.
SqlConnection conn = new SqlConnection("Integrated Security=SSPI;Database=TestRules;Server=(local)");
conn.Open();

// Create an adapter to fill the dataset.
SqlDataAdapter da = new SqlDataAdapter("select * from Customers", conn);

// Explicitly define the Update command for the database adapter object.
// Directly using da.Update without this will give an error.
SqlCommandBuilder commBldr = new SqlCommandBuilder(da);
da.UpdateCommand = commBldr.GetUpdateCommand();

DataSet ds = new DataSet("TestRules"); // create a dataset
da.Fill(ds, "Customers"); // fill the dataset

// .... Not required for the single object scenario
XmlDocument xd1 = new XmlDocument();
xd1.Load(@"C:\BiztalkProjects\BRE\TestSmall.xml"); // XML instance to be passed
TypedXmlDocument doc1 = new TypedXmlDocument("Books", xd1);
// ............

TypedDataTable tdc = new TypedDataTable(ds.Tables["Customers"]);

// Only pass one fact (tdc) for the single object scenario (first scenario).
object[] shortTermFacts = new object[2];
shortTermFacts[0] = doc1;
shortTermFacts[1] = tdc;

Policy pol = new Policy("Testing", 1, 0); // major and minor versions
DebugTrackingInterceptor tracking = new DebugTrackingInterceptor("ShashiOut.txt"); // writes the debug file under the "bin" folder

try
{
    pol.Execute(shortTermFacts, tracking);
    da.Update(ds, "Customers");
    conn.Close();
}
catch (Exception ex)
{
    System.Console.WriteLine(ex.ToString());
}

Take Away :

1) While testing a policy, the BRE creates the database connections/objects itself. When executing the policy from a .NET class, the database objects need to be explicitly created.

2) It is a good practice to create a tracking file while executing a policy. This can help to troubleshoot issues during development. It can also be compared with the tracking log that is created by the Business Rule Composer when testing a policy. This way we can observe the list of parameters that the policy is expecting and make sure that these parameters are correctly passed while executing the same policy from outside the Rule Composer. By comparing these 2 logs, I was able to figure out when the policy is expecting a TypedXmlDocument and a TypedDataTable.

3) The DataSet object and XML document have to be passed as "typed".

4) The 2 object scenario can be extended to multiple objects.

5) I know it is very painful to copy and create new versions of a policy whenever any modifications are required; the same applies to vocabularies. The following SQL commands can ease that pain by undeploying the policies/vocabs and allowing users to make modifications. It is highly recommended that these commands be used only in the development phase.

For Policies:

declare @RuleSetId Int
select @RuleSetID =nRuleSetID
FROM re_Ruleset
WHERE strName = 'Testing'
AND nMajor=1 and nMinor=0
UPDATE re_ruleset
SET nStatus =0
WHERE nRuleSetID = @RuleSetID
DELETE FROM re_deployment_config WHERE nRuleSetID=@RuleSetID
DELETE from re_tracking_id WHERE nRuleSetID=@RuleSetID

For Vocabs:

UPDATE re_Vocabulary
SET nStatus = 0 /* to republish, set it back to 1 */
WHERE strName = 'Vocabulary1' AND nMajor = 1 AND nMinor = 0

I think the BRE is a very powerful tool if used properly. The key is to use it for complex scenarios that can leverage functions like Update, Assert and Retract; using it for simple queries is under-utilization of the tool. In terms of performance, the BRE will work better than a traditional SQL query in complex scenarios.

It is not truly a business user tool. Concepts like versions, vocabularies and facts are not easy for a business user to understand. Also, the fact that any deployed policy can't be changed unless a new version is created is not helpful either. Developers are required for the initial set up, creating new vocabularies, etc. Business users can definitely change variable values that are already set up, but they will need some training to do more than that.

I would imagine that MSFT understands these concerns and we can definitely see improvements in subsequent versions.