
Monday, August 15, 2011

Denali for the Data Steward


- By David Nolting

Denali CTP was released in July. You can find it at Microsoft Denali

We’ve recently finished a Proof of Concept using Denali to model the organizational structure of an international energy provider based in France. Denali introduces major improvements to MDS by putting data stewardship capabilities into Excel, the tool most used and best mastered by the user profiles who will be defining MDS entities. Denali marks Microsoft’s foray into the MDM space, with its strong Excel integration and its very nice Silverlight improvements to the MDS interface.
We could go on about the functional improvements introduced in Denali, but much of that has already been described on the web site. An excellent source is this Channel 9 video: TechEd 2011 Denali

Real World MDM

In France, MDM is becoming a topic of discussion in IT urbanization. As data needs to be published universally across the enterprise, defining entities and aggregating data from disparate sources seems to be the best approach to accomplish this. Our client has been involved in many middleware projects using BizTalk for inter-application integration; however, the motivation to implement an MDM POC for universal data publishing surpassed anything I’d seen with the middleware projects. MDM was deemed critical because the current data integration environments for centralized data management had become difficult to maintain and unreliable. MDM, as a business concept, is much easier for the business group to get its head around than, say, middleware integration.

This project pushed MDS features to their limits. Key POC criteria consisted of the following:

  • Ability for the Data Steward profile to Model Data and control the Data Entry process
  • Ability to Integrate Tight Business Rules
  • Ability to Create Workflow around the data approval process
  • Ability to notify appropriate data owners when Business Rule anomalies occur
  • Ability to Partially expose Hierarchy Information ( nodes ) to defined users and user groups
  • Ability to subscribe to multiple versions of the data and the current version of data
  • Ability to import Data Subscriptions into Excel
  • Ability to Roadmap SOA Architecture for system and portal integration

MDS has all of these features, but getting them to work as expected is another matter. The majority of our time was spent on the Business Rules, Workflow, and Security aspects of the POC.

User Profiles

As with most projects and POCs, it is important to understand who will be doing what to achieve successful MDM. Our client defined two principal roles:

The Data Administrator

The Data Administrator works on modeling ( building hierarchies ), Business Rule definition, Security assignment, and Version Management. This profile manages the MDS artifacts associated with a model version.

The Data Steward

The Data Steward defines entities, manages the quality of data, and participates in the workflow process used to integrate data in the MDS model.  The data steward receives notifications of data anomalies.


MDS vs. the Competition

Our proof of concept pitted SAP MDM against SQL Server MDS in Denali. Our customer currently uses Business Objects for B.I. and SAP for ERP, so MDM solutions from SAP were politically advantaged.
Key business and technical drivers defining the POC included budget concerns, implementation complexity, and the ability to meet the basic design criteria: Modeling, Hierarchies, and Security. This was a first-time, real-world exercise in MDM for the client, and motivation was very high. The project was driven by the IT department, but the business group quickly became stakeholders. A comparison of SAP MDM and Denali MDS was made early on, which led to the decision to select Denali MDS as the more appropriate technology for the POC. SAP MDM had more features and was deemed more complete; but if Denali could meet the baseline requirements of the POC, it would be the technology of choice for MDM for at least two projects within the organization. Why was this decision made? Essentially because the SAP MDM solution was considered too complex to implement for the POC and too expensive for the scope and budget of the project, whereas Denali was pitched as an MDM-lite solution that was much easier to implement and could meet the baseline requirements of the project.

ReachSOA was called in to help pick up where Microsoft training left off. We worked on building architecture around MDS to round out the Denali offering so that it matched some of the “off the shelf” capability touted in the SAP MDM solution.

Before we get to that, here are some key functional takeaways from the POC:
       
Data Modeling

The Data Administrator modeled and remodeled Derived Hierarchies using the Excel plug-in. What’s important is that CSV data dumps from various systems could be rapidly modeled in Excel. This phase was drastically easier to perform than in the previous version of MDS.


The Data Steward easily defined and modeled about 10 different entities that reflected real corporate data. The exercise in MDM entity identification and definition was the first step in helping the business formalize data relationships and organize data more strategically. Prior to this exercise, the entities had not been formally defined.

Data Modeling using the Excel plug-in proved easy and intuitive enough to allow the Data Steward to define entities and establish derived relationships between them. The Excel plug-in drastically reduced the time needed to import and stage the data compared with the previous version of MDS, where SSIS or other means were required to import the data. The plug-in adds high business value to MDS by transferring entity definition and data import capabilities to the functional Data Steward.

Version Management

POC expectations required us to focus on version management capabilities built into MDS.  The POC had to successfully demonstrate how the Data Administrator could define and manage versions of the data. 
Version Management requirements for the POC included Version creation, Version backups, assignment of the Current Version, fallback to a previous version, Version Data Subscription management, and Version Delta Reporting. MDS was capable of meeting most of these requirements. The only area where it seemed to fall a little short was delta reporting. We showed Transaction Reporting to detail modifications made to models, entities, and data, which seemed to satisfy the client. We showed that transaction data could be mined to create simplified delta reporting, but a one-touch solution, like a database diff report, was not available off the shelf in MDS. Other version management features were very close to what the client expected. The Version Flag can be used to guarantee that subscriptions retrieve data from the most recent version, as shown below:


Figure 1 - Subscription Export based on CURRENT Version of Data

This helps with Change Management control for subscribing services. These subscriptions can be mapped to Service Data Transformation Objects to assure that connected applications move seamlessly from one version of the MDM data to the next.

Versioning is intuitive and flexible enough to meet the demands of the POC. We showed how the Data Manager can transition versions from open to committed states and report on ancestry:






Figure 2 - Version Ancestry

Security

Security was a critical aspect of the POC. MDS had to show how security could be managed at the attribute level of an entity. Hierarchies structured by business units had to assure that managers could only visualize and modify entities in the business units they managed. Basic security features were easy to configure; however, we ran into challenges trying to configure MDS to block out Hierarchy Members for unauthorized data managers. We had to contact Microsoft who, in turn, contacted experts in the U.S. to configure our beta copy for node-level security in Hierarchies.

Once we got over the hurdle of security at the hierarchy level, we successfully demonstrated the security requirements for the POC application. Without this support, we would have had to resort to collections to expose partial hierarchies. Once configured, the built-in security features worked perfectly.




Business Rules

The Data Administrator managed to define a set of business rules that validated data.  Since data is imported from disparate sources, basic data cleansing and normalization was easily performed through the Business Rules layer.  


Email notifications were assigned to Business Rule anomalies. MDS configuration was simple enough and worked 100% for the POC.

        
Workflow

Workflow was built into the POC, but it did not consist of WF modules linked to Business Rules. The Data Manager assigned a simple workflow leaf attribute to the principal entity. When the entity was initially imported, the flag was set to “Needs Approval” and a notification was fired to the data steward. A link to the data entity entry screen in the email allowed the Steward to review, and then approve or reject, the entity. This was an acceptable approach for the POC; however, true WF integration with access to a SharePoint portal will probably be the preferred approach for the actual implementation. The advantage of the WF-triggered business rule is that the data is not integrated into the MDS database and will, therefore, not be visible in export subscriptions.


SOA


Exposing services that consume MDS entity and aggregated data is an important long-term requirement for an MDM solution. Without getting into too many technical details, we showed ODATA service modeling of MDS entities. This approach works well for read-only consumption of data from MDS. Services should be decoupled from the MDS structure through versioned DTOs ( Data Transformation Objects ), which allows for change management of the underlying MDS data without directly impacting services. We look forward to working with the MDS WCF services that allow for entity creation, data publishing, and essentially the features available in the Excel plug-in.
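As a rough illustration of that decoupling, here is a minimal sketch that maps a customer entity (as exposed by an MDS subscription view through Entity Framework) onto a versioned DTO. The type and property names are assumptions for illustration only, not the actual POC model.

```csharp
// Minimal sketch of versioned DTO decoupling over an MDS subscription view.
// All names here (Customer, CustomerDtoV1) are hypothetical.

// Stand-in for the entity generated from the MDS subscription view.
public class Customer
{
    public string Code { get; set; }   // MDS member code
    public string Name { get; set; }   // MDS member name
    public string City { get; set; }
}

// Version 1 of the contract exposed to service consumers.
public class CustomerDtoV1
{
    public string Code { get; set; }
    public string Name { get; set; }
    public string City { get; set; }
}

public static class CustomerMappings
{
    // If the MDS model or subscription view changes, only this mapping has to
    // evolve; consumers of CustomerDtoV1 are unaffected.
    public static CustomerDtoV1 ToV1(Customer entity)
    {
        return new CustomerDtoV1
        {
            Code = entity.Code,
            Name = entity.Name,
            City = entity.City
        };
    }
}
```

A v2 contract would simply be a new DTO class and mapping, leaving existing subscribers on v1 until they are ready to move.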


        
Change Management


Change management should be built into MDS and the surrounding SOA solution from the beginning. To reassure the business, we addressed Change Management and Versioning strategies at every stage, including model versioning and versioning of services.

POC Results

As mentioned earlier, the POC was sponsored and managed by IT. The Business Group did not see the results until the final presentation. MDS and our presentation fared well in front of the Enterprise Architects and Business Process Owners. The BP Owners locked onto the potential early on, seeing the Excel plug-in and the Excel export as a practical means of managing data. Seeing real data restructured in MDS hierarchies spoke volumes. The lead EA appropriately challenged roles, infrastructure requirements, and who would be in charge of managing the new data source. IT’s involvement in the design and functional application modeling will provide strong backup and support for MDM moving forward.

Fortunately, a project will be launched in early September, after the holidays, of course ( “les vacances bien sûr” )!
  

Monday, May 9, 2011

MDM over ODATA : Extending MDS with ODATA Services and Entity Framework

- By David Nolting
Introduction
In my last post I discussed the benefits of using SQL Server Master Data Services to build reliable MDM solutions. That post covered modeling and a quick comparison of MDS against required MDM features. In this post, I look at extending MDS with ODATA services so that entities defined in the MDS model can be consumed by clients relying on the JSON and ATOMPUB standards. ODATA services work exceptionally well with the concept of entities, so they seem a natural tool for publishing MDS entities.

Solution Architecture
For good reasons, the MDS tables are very metadata-centric, which is what allows MDS to manage versions, models, and staging, so it is not practical to connect directly to the underlying MDS tables. The way we extract readable entity information is by creating Subscription Views in MDS, which are nothing more than auto-generated SQL views that represent MDS entities according to the model definitions. Once the views are created, we could perform LINQ to SQL queries to expose entities; however, since ODATA works so well with Entity Framework, why not model the Subscription Views in Entity Framework and expose the EF model over ODATA services? This solution has a very small code footprint, can be built with a few clicks, and allows consumers to filter entity data on any of the existing entity attributes.


MDS Subscription Views to Entity Framework
MDS allows you to auto-generate SQL views for entities.   An example view for the sample customer entities looks like the following:



With that, it is easy to create an Entity Framework model based on the subscription views.  I created a basic .Net class library that contains an ADO.NET Entity Data Model:



I created the model from the Customer view:


The Entity Framework model contains an entity named Customer, which is the same name as the view. This entity is readable and contains all the master data attribute information associated with a customer:


ODATA Services ( REST )
We now have an Entity Framework model built around our MDS solution, and we created it entirely with the EF wizard. The next step is to create an ODATA service that publishes Customer entities and allows filtering on attribute information. ODATA's built-in filtering mechanism can be accessed through HTTP URL query syntax. Our first step is to create a WCF Application project and add the Entity Framework project as a reference:


I removed the default WCF services created with the project.   Now we need to add our ODATA Service to our project as follows:


The service generates the following code:


We merely have to reference the EF Model’s entities in the code.  The service is created locked-down so we have to open up access to the entities in the model. 
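A minimal sketch of what the opened-up service class might look like is shown below. The context name (MDSEntities) and the entity set name (Customers) are assumptions based on the EF model described above, not the exact generated code.

```csharp
// Hedged sketch of the WCF Data Service over the EF model built from the
// MDS subscription view. MDSEntities is an assumed name for the
// wizard-generated context; adjust to whatever the EF wizard produced.
using System.Data.Services;
using System.Data.Services.Common;

public class MDSService : DataService<MDSEntities>
{
    // Called once to configure service-wide policies.
    public static void InitializeService(DataServiceConfiguration config)
    {
        // The generated service is locked down by default; open read access
        // to the entity sets we want to publish.
        config.SetEntitySetAccessRule("Customers", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}

// Example requests once the service is running (filter values are illustrative):
//   http://localhost:56408/MDSService.svc/Customers
//   http://localhost:56408/MDSService.svc/Customers?$filter=City eq 'Seattle'
//   http://localhost:56408/MDSService.svc/Customers?$filter=SalesDistrict_Name eq 'West'
```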

Once compiled, we can view the ODATA service in the browser ( right click on the service and select View in Browser ).   Accessing the URL http://localhost:56408/MDSService.svc/Customers returns all the customers in MDS:


With ODATA filters, I can query the data on any attribute. So, for example, a query on the City attribute results in a filtered list of MDS entities.



Another filter on the SalesDistrict_Name returns the following:



Conclusion
Within a matter of minutes, and with modification to just two lines of generated code, I was able to wrap Entity Framework around MDS and expose my MDS customer entities over ODATA. Once available as an ODATA service, consumer applications can query on any attribute in the MDS list. This is a powerful and very easy solution for SOA over MDS using the Microsoft technology stack. Of course, this is just a start. In a real-world application, ODATA operations would more than likely be added to provide more powerful filters and analysis of MDS entity data, which would involve writing some LINQ code against the Entity Framework model. But often, MDM systems need a quick way to expose raw entities over SOA for referential checking, and this solution seems to fit the bill.


Monday, April 11, 2011

High Performance MDM with AppFabric Cache, BizTalk, and Dynamics AX


- By David Nolting


The Context

Recently, ReachSOA architects were engaged to help model a referential architecture for a renowned international sports federation that manages both affiliates and public customers for public sporting events. The combined affiliate and public customer base ranged in the millions. At the time we were called in, they were at the tail end of a revamp of their CRM and ERP applications. As with most organizations, the federation relied on several disparate applications to manage affiliate membership, online ticket sales, and front office invoicing and back office order fulfillment. Microsoft Dynamics AX 2009 was chosen as the CRM product and would eventually morph into a full-blown ERP solution. The ERP would become the MDM "source of truth" referential data source for customers (CDI) and products (PIM), as well as a historic base for product sales from the online and front office applications.

The solution was a classic BizTalk / Dynamics AX data synchronization model using BizTalk and the AX adapter:


 


The SLA Challenge
The integration between the affiliate database, BizTalk, and Dynamics AX proved to be very reliable. However, the integration time in AX was costly: the database staging and AX business integration rules required to validate and integrate customers took time. BizTalk would deliver customer data to the AIF layer, where it would be subjected to a lengthy staging process. Customers created from the Front Office and Online Sales applications needed to be made available for product fulfillment within minutes, and solicitation of the customer master record could come from potentially thousands of connected users; an SLA of 15,000 simultaneous connections was established.



At first, the architects looked at exposing Dynamics AX web services for consumption by the Front Office and Online applications. However, this raised many technical concerns, such as performance, AX Business Connector licensing, and the "real time" reliability of the MDM database.



ReachSOA Analysis
We suggested a POC limited to 5 days to define and test an architecture using AppFabric Caching of the MDM. The cache would expose pre-MDM services allowing retrieval of master customer and product data, as well as the ability to create the essential MDM customer and place it into the cache so that it is available for consumption by the product fulfillment process within the stated SLA of minutes ( in fact, it would be instantaneous with the cache ).

To cut to the chase, we modeled the following technical architecture which combines synchronous and asynchronous flows using AppFabric and BizTalk:




 


 

We identified a series of Synchronous and Asynchronous flows. AppFabric Cache would be used for Synchronous Customer creation: a web service would allow for the creation of a customer from a subscribing app. The customer is updated to the cache and pushed to BizTalk for an asynchronous update to the MDM system. Ideally, MDM business rules would be called by AppFabric to pre-validate the Customer Information before it was updated to the cache. Business Rules integration with MDM was road mapped for a production version.
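A rough sketch of the kind of thin .NET facade we wrapped around the AppFabric Cache API is shown below. The cache name, key scheme, and customer payload type are illustrative placeholders, not the actual POC code.

```csharp
// Hedged sketch of a facade over the AppFabric Caching API
// (Microsoft.ApplicationServer.Caching). Names are illustrative only.
using Microsoft.ApplicationServer.Caching;

public class CustomerCacheFacade
{
    private readonly DataCache _cache;

    public CustomerCacheFacade()
    {
        // DataCacheFactory reads the cache client settings from the
        // application configuration file.
        var factory = new DataCacheFactory();
        _cache = factory.GetCache("CustomerCache");
    }

    // Publishes (adds or replaces) a customer payload in the cache.
    public void PutCustomer(string customerId, string customerXml)
    {
        _cache.Put("customer:" + customerId, customerXml);
    }

    // Retrieves a customer payload, or null if it is not in the cache.
    public string GetCustomer(string customerId)
    {
        return _cache.Get("customer:" + customerId) as string;
    }
}
```

A facade like this can be called both from the synchronous web service that creates customers and, as described below, from BizTalk, so that the cache stays in step with the asynchronous updates to AX.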

Rebooting the cache meant reloading the cache from the AX database. Fortunately, a cache reboot executed in about 30 minutes for millions of customers on a very modest VM platform.

We structured the cache so that it remained online at all times and a reboot merely updated entries in the cache rather than dumping and rebuilding it from scratch.



BizTalk Integration
We were able to update the cache from BizTalk transactions that came from the Affiliate database. This meant that affiliate information would be published to the MDM cache well before it was staged into the MDM database, meeting the SLA requirements. BizTalk called a .NET assembly that we built to wrap the AppFabric Cache API. Below is the component architecture of the cache solution:






 


Once the cache API was wrapped in our .NET service facade, it was easy to hook up BizTalk and the cache:






 


 

WF Integration
We promoted the use of WF ( Workflow Foundation ) to model the creation/update process. This workflow manages a transaction between the cache and the asynchronous integration tables that BizTalk uses to pick up and pass modifications to AX. If an update fails at the cache level or the SQL level, the transaction is rolled back and a fault is returned to the consuming app.
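Expressed as plain C# rather than WF activities, the transactional behavior might look roughly like the sketch below; the staging table, connection handling, and payload type are hypothetical, and the real POC modeled this as a workflow.

```csharp
// Hedged sketch of the create/update transaction described above.
// Table and type names are hypothetical.
using System;
using System.Data.SqlClient;
using System.ServiceModel;
using System.Transactions;
using Microsoft.ApplicationServer.Caching;

public class CustomerCreationService
{
    private readonly DataCache _cache = new DataCacheFactory().GetDefaultCache();

    public void CreateCustomer(string customerId, string payload, string connectionString)
    {
        try
        {
            using (var scope = new TransactionScope())
            {
                // 1. Write to the asynchronous integration table that BizTalk
                //    polls in order to pass the modification on to AX.
                using (var conn = new SqlConnection(connectionString))
                {
                    conn.Open();
                    using (var cmd = new SqlCommand(
                        "INSERT INTO CustomerStaging (CustomerId, Payload) VALUES (@id, @payload)",
                        conn))
                    {
                        cmd.Parameters.AddWithValue("@id", customerId);
                        cmd.Parameters.AddWithValue("@payload", payload);
                        cmd.ExecuteNonQuery();
                    }
                }

                // 2. Publish the customer to the AppFabric cache. The cache call
                //    is not transactional itself, but if it throws, the SQL insert
                //    above is rolled back because the scope is never completed.
                _cache.Put("customer:" + customerId, payload);

                scope.Complete();
            }
        }
        catch (Exception ex)
        {
            // Roll back (by not completing the scope) and fault the consuming app.
            throw new FaultException("Customer creation failed: " + ex.Message);
        }
    }
}
```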







Functional Testing
The functional results of our POC were clearly positive. We managed to demonstrate flexible and rapid queries against the AX referential store thanks to the AppFabric cache. The POC also demonstrated strategic possibilities such as full-blown search and the creation of SOA business domains for the affiliation.



Performance Testing
Performance testing was limited to Visual Studio load tests hammering the cache platform. Using this approach, we expressed the 15,000-user connection SLA in terms of the number of transactions we could handle per second, an SLA formulation commonly used in EAI/ESB performance projects. This seemed to fit because the cache would sit well behind the online website, and the transaction that involved querying the customer data would be less than 1% of the entire online experience. Getting to a figure for the number of transactions per second would help us project the number of servers required for a full-blown implementation. Our cache was distributed across 3 machines as pictured below:






 

We used 3 virtual machines ( VMWare ESX ) to distribute the cache across multiple machines. Our initial goal was to show the High Availability/Fail Over capabilities of the cache; however, we did not have Windows 2008 R2 Enterprise installed on these machines and it was beyond the scope of the POC to upgrade from Standard to Enterprise.

We achieved anywhere from 85 to 95 transactions per second, which seemed to largely satisfy the review committee given that we were not working in an optimal environment but had merely grabbed a few available machines.





We performed tests such as the following:


 

  • Test duration: 20 minutes
  • 104,819 tests completed
  • 87.3 tests / sec
  • 210 threads ( limitation: VS Load Test / CPU )
  • 0.35 avg test time

A few Technical Gotchas
As our cache theoretically needs to remain available at all times, we configured High Availability for the cache. One important characteristic of our caching model was eliminating eviction: regardless of hit ratios, we needed all data to stay in the cache. We stumbled upon this during the POC. In addition, we will need to activate notifications in the event the cache drops entries due to eviction. So, some time needs to be spent to understand and configure the cache settings precisely. Perhaps in another post, we will drill down into some of the technical issues; this post is more high-level and shows the business value of the technical architecture.


Conclusions/Opportunities

The POC not only met but surpassed expectations. The POC debrief helped underline the future potential of the solution. Here are the key takeaway points from the POC:

Practical Points

  • Inexpensive Solution
  • Rapid Development
  • High Performance

 

Opportunities

  • Potential for Full Blown Search
  • First step towards MDM + SOA Enterprise Architecture
  • Opportunity to Provide Service Inventory for Affiliate databases ( without creating another Silo )
  • Best Practices and Governance in SOA
  • Cloud ( public or private ) elasticity

 


MDM with Microsoft Master Data Services ( MDS )

- By David Nolting
Recently, we've been pulled in to look at how MDM ( Master Data Management ) can be achieved with Microsoft technology. The obvious starting point is the SQL Server 2008 R2 MDS ( Master Data Services ) component that ships with the Enterprise edition of SQL Server.

In a nutshell, the MDS offering looks quite promising. It is lightweight in scope, which is probably an advantage in reducing scope and complexity for first-time MDM implementations.

MDM is often described as an SOA Enabler. We're finding that MDM thinking is seeping into the mindset of companies, even in France, for various business reasons: managing too many disparate data sources for customers, products, and other business entities is too costly and involves too much financial risk when errors are propagated across business domains.

We have just completed a POC using MDS where the goal was to use MDS to replicate an ERP based MDM model.

In case you're not familiar with MDM: most enterprise data has grown with the enterprise through silos like this:
Here we see arrows going all over the place. An MDM solution lets you get rid of lots of these arrows as follows:
MDM becomes the "source of truth" for entity management.

What an MDM solution should be: To really review the MDS offering and see how "complete" it is, we looked to a great source for MDM: Enterprise Master Data Management: An SOA Approach to Managing Core Information by Allen Dreibelbis. This is an exhaustive, agnostic look at MDM. So here, I'm going to look at their MDM solution feature list and see how MDS stacks up. Here are the essential features:

Master Data Life Cycle Management: manages all aspects of creating, versioning, and securing access to data authoring. MDS does provide strong life cycle management capabilities such as versioning and hierarchies ( derived, explicit, and derived with explicit caps ). However, some of the features we did not find in MDS were:


  • Flexible mapping capability across business domains ( models in MDS ). Entity mapping across MDS models does not seem to be natively supported. For example, if you define a CDI ( Customer Model ) and a separate PIM ( Product Model ), entities across model domains can't be mapped in hierarchies. However, there are ways of getting around this through modified subscription views.
  • Versioning is supported at the model level and provides Model Locking and Version Ancestry Analysis. Versioning at the entity level seems to be a strong requirement. However, MDS provides notifications for data governance issues for data stewards. It seems that the best way to handle entity versioning is at the Model level with sub-versions. When using something like ODATA services over REST, URLs could point to different model versions that expose entities according to the macro model version number. So this doesn't seem like a big issue to me.
Aside from these exceptions ( which can be handled through minor workarounds ), MDS seems to provide the base Life Cycle Management functionality including Audit, Authoring Security, Data lineage, Referential MDM, multiple taxonomies, etc.

Pictured below is a Derived Hierarchy based on Geographical Sales. Derived Hierarchies are inherent, attribute based taxonomies.
Data Quality Management Capability: ability to provide data quality management as data is loaded into the MDM, and real time data governance to notify stewards of data issues. MDS seems to cover this quite well through a Business Rules Processor with extensibility through WF workflows if more heavy lifting is required or if human review is necessary.

MDS provides an intuitive Business Rules Layer as pictured below:
MDS provides Business Rules that are versioned and published and provide IF / ELSE / ACTION behavior:
These business rules allow for basic attribute value checking for data consistency and data logic. Business Rules can be neatly tied to Windows Workflows that are published as services in SharePoint or an SOA Server:
A couple of features, such as Data Reconciliation and Data Cleansing, are pushed off to the ETL ( SSIS ) products and are expected to be performed before staging data in MDS. Data Reconciliation, or the ability to identify duplicate entities ( a customer name or address is slightly different ), can be a complex procedure. The most sensible, low-cost way to identify data duplication could be to check for entity likeness and report suspected duplicates to the data steward. So you could imagine the ETL or EAI procedure that is loading your data querying the SOA layer with a Customer-LIKE operation based on name, address, and telephone; if more than zero entries are found, a business rule could be triggered to notify the data steward and reconcile the data.

So it seems that Data Reconciliation and Cleansing need to be handled outside of the MDS product.
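As a hedged illustration of the Customer-LIKE idea, the sketch below runs a simple likeness query against a customer set exposed through an MDS subscription view ( for example via Entity Framework ). The entity shape and matching logic are assumptions; a real implementation would likely use fuzzier matching.

```csharp
// Hedged sketch of a "Customer-LIKE" duplicate check. The Customer class is a
// stand-in for the entity generated from the MDS subscription view.
using System.Linq;

public class Customer
{
    public string Name { get; set; }
    public string City { get; set; }
    public string Telephone { get; set; }
}

public static class CustomerLikenessChecker
{
    // 'customers' would typically be the EF entity set backed by the view.
    // A result greater than zero could trigger a business rule that notifies
    // the data steward for manual reconciliation.
    public static int CountLikelyDuplicates(
        IQueryable<Customer> customers, string name, string city, string telephone)
    {
        return customers.Count(c =>
            c.Name.Contains(name) &&
            (c.City == city || c.Telephone == telephone));
    }
}
```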

Master Data Harmonization: a fancy way of saying you need to get data from disparate LOB applications and possibly unstructured data sources into your MDM solution, and vice versa. MDS exposes staging tables, and it's your responsibility to load them via ETL or other means. Microsoft provides lots of answers to data harmonization, such as BizTalk for high-end messaging integration and SSIS for basic, good old-fashioned ETL loading of staging tables. We will look into how an AppFabric WCF service could be exposed to help with data duplication issues. Such a service could be consumed by BizTalk in a messaging scenario, or perhaps in the Business Rules layer as a default workflow check on all incoming entities. Any entities suspected of duplication could be flagged and sent to another workflow for review.
The Microsoft Middleware stack provides lots of options to enrich the MDS layer. It's obvious that SharePoint and WCF will be the candidates of choice to extend MDS. But BizTalk, which naturally plays a role in data synchronization between an ERP and disparate apps, could be used as well for asynchronous message transfers and pub/sub bus like interaction with MDS. For lightweight enterprise application integration, AppFabric Connect, with the help of WCF Adapters and the BizTalk Mapper, could also be used, especially for synchronous situations. To be honest, we need to dig a little deeper into the MDS notifications to see how BizTalk or other systems could subscribe to MDS artifact modifications so as to update a subscribing system. AppFabric Cache is a strong candidate for performance in the event that thousands of online users could, for instance, request product PIM information from a web application. In this case, an MDM model could be published to the Cache and made available for online queries.

Analysis and Insight Capabilities: MDM solutions should provide inline analytical insight capability to help establish cross relational data and potential business opportunities. This seems to be the realm of BI and perhaps the StreamInsight offering which is built into SQL Server. Good modeling should help with insight capabilities as well. Take for example a banking account model that should tie together an individual with all his accounts ( mortgage, savings, investment cds ) and would be able to identify relationships such as spouses, siblings and remote family members. Business Insight coupled with good modeling would identify, for example, a mortgage loan opportunity as a result of a change of marital status for one of the siblings, or college loan opportunity as a sibling nears high school graduation date ( i.e. 18th birthday ), etc. StreamInsight could subscribe to events triggered by the MDS model and process them for business opportunities.

Building an SOA Layer on MDS

MDS data tables cannot be reliably queried directly, and even if you reverse engineer the tables, developing such a strategy is risky because future product versions may change the structure and therefore break your query model. What MDS does provide are subscription views. Currently, subscription views allow for queries against single entity types or derived hierarchies, and render a readable version of the entity with columns that match the attribute names. A view generated by MDS resembles the following:
As you can see, the T-SQL code has to do a lot to render a technical, metadata-driven MDS entity into readable output. But once the entity view is created, composite views can be developed to join entities together and to filter data:
Subscription views make it easy to get to the entities, and from there you can build upon the queries through LINQ, Entity Framework, or simple ADO.NET to publish the data out through WCF services. MDS could be a great candidate for ODATA WCF services to publish entities as URLs. There is a WCF API with MDS that I'm sure could be leveraged; however, we couldn't find much documentation on it, so I guess it would require some trial and error. The API seems to provide full authoring and lifecycle services as well as business rules.

The business rules WCF services would be ideally used to prescreen data from an external system to see if it will pass the MDS business rules prior to shipping it to MDS or another application.

Conclusion

Our POC delivered a satisfactory model that was able to replicate the model defined in the ERP system, and we did this within a couple of days including taking the time to work through MDS examples to fully understand its Modeling API. A second phase of the POC is required to dig deeper into MDS to define the most suitable Data Integration and Harmonization techniques ( BizTalk, SSIS, etc. ) and to roadmap SOA extensibility by digging into the MDS WCF API or by building business domain services that tap into subscription views.

We are getting more and more requests to look into MDM for obvious reasons, and now that Microsoft SQL 2008 R2 provides MDS as a core MDM component, we are more than happy to add MDS to our integration stack. We've just started to scratch the surface and have concentrated on the real business value of MDS which is Modeling. The extended potential of MDS as an SOA backbone will require help from other components in the Microsoft stack such as WCF, WF, SharePoint, and potentially BizTalk. And, of course, an architect will be needed to glue it all together.

Great Resources for MDS and MDM

http://sqlblog.com/blogs/mds_team/archive/2011/01/11/new-master-data-services-training-available.aspx

http://www.packtpub.com/microsoft-sql-server-2008-r2-master-data-services/book

MDM Book: Enterprise Master Data Management: An SOA Approach to Managing Core Information by Allen Dreibelbis

 

Monday, February 28, 2011

Welcome to ReachSOA


Welcome to our blog.
We will be blogging on unfolding events as we experience them in Enterprise Solution Architecture. We are based in Europe and work with Global 2000 companies, so for our American audience there may be interesting points of comparison and contrast with U.S. project experience.
Our focus is on SOA architecture with a concentration on MDM, BPM, B2B, and A2A. As with most companies in our industry, we struggle to promote strategic Enterprise Architecture against most organizations' tendency to build out tactical, brittle solutions, even when adopting strategic technology platforms.
We specialize in Microsoft technology but uphold an agnostic approach to Enterprise Architecture and opportunity identification in on-premises and cloud-based integration.
We are experienced in EA methodologies such as TOGAF 9.   We are also SMEs on Microsoft SOA and Integration platforms.
Our mission is to advance SOA and strategic integration, helping companies achieve business optimization.
Some of our favorite authors on EA and SOA:   Jeanne Ross and Thomas Erl
We will be blogging on both high level, logical architecture and IT governance issues as well as on the nuts and bolts of some of the solutions we have developed.
 Dave Nolting
 Cofounder ReachSOA