Monday, April 11, 2016

Cloud Maturity Model

The progression of a company's Cloud maturity, from an initial adopter of hardware abstraction to the end state: a business-centric, agile Cloud.




Monday, September 21, 2015

Where is your Organization in the Culture of Trust Spectrum?


Tuesday, July 21, 2015

Cloud Migration - the sun is shining above the clouds

The Cloud is 10 years old, yet it is only now that mainstream migration to the Cloud is becoming a reality.

In truth though, no one makes this journey just once. Hasty migrations risk two major consequences:


• The migration process is ill-planned and therefore likely to encounter difficulties


• The destination environment fails to live up to the unreasonable expectations placed on it

As a Solution Architect, I have distilled my experience and understanding of migrating to the Cloud into the following presentation.



Cloud Migration Methodology presentation

Thursday, March 29, 2012

Business Intelligence Maturity Model

As the Business Intelligence Maturity of a company evolves, it goes from reactive questioning - "What happened?" to Analysis - "Why did it happen?" and finally "What will happen and what can we do about it?"

Shadowing this evolution are its systems: usually spreadsheets and some basic web reporting at Level 1, consolidated data marts at Level 2, scorecards and a defined metrics framework at Level 3, and finally an EDW with multi-dimensional drill-downs at Level 4.

Here is a comprehensive map of the Business Intelligence Maturity Model of an enterprise.


The challenge is not to go from Level 1 (usually the starting point) to Level 4 in one bound - such an ambitious leap will kill the project, and your people will not be ready. The trick is to deliver a tactical project that is not throwaway and that creates a foundation for the strategic vision.

 The steps to do this are not as formidable as they seem if you follow a good design process.

  1. Envisioning session with Executives to design the most effective reporting (ignoring current limitations)
  2. Create Enterprise Data Model of data elements (metadata)
  3. Logical mapping of all input sources to the EDM
  4. Create Data Warehouse schema with multi-dimensional modeling derived from the Data Dictionary and report requirements
  5. Create ETL Data Flow tasks to map input sources to the Data Warehouse
  6. Create web forms to capture data which are also mapped to the EDM
  7. Cleanse, merge, normalise data going into the data warehouse
  8. Design agreed reports in Report Builder
  9. QA and Release
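Steps 4, 5 and 7 above can be sketched in miniature. The following is a hypothetical example, not taken from any real project: a tiny star schema (one dimension, one fact table) loaded from a raw feed after basic cleansing and normalisation. The table and column names are invented for illustration.

```python
import sqlite3

# Hypothetical mini star schema: one dimension (dim_region) and one
# fact table (fact_sales), loaded from a raw feed after cleansing.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE fact_sales (region_id INTEGER, amount REAL,
                             FOREIGN KEY (region_id) REFERENCES dim_region);
""")

# Raw input rows as they might arrive: inconsistent casing, stray
# whitespace, and one unparseable amount.
raw_feed = [(" North ", "100.5"), ("north", "bad"), ("South", "200")]

def cleanse(region, amount):
    """Trim and normalise the region name; reject non-numeric amounts."""
    try:
        return region.strip().title(), float(amount)
    except ValueError:
        return None  # in a real pipeline, route to an error queue

for row in raw_feed:
    cleaned = cleanse(*row)
    if cleaned is None:
        continue
    region, amount = cleaned
    # Merge: reuse the dimension row if the normalised name already exists.
    cur.execute("INSERT OR IGNORE INTO dim_region (name) VALUES (?)", (region,))
    cur.execute("SELECT region_id FROM dim_region WHERE name = ?", (region,))
    cur.execute("INSERT INTO fact_sales VALUES (?, ?)",
                (cur.fetchone()[0], amount))

cur.execute("""SELECT d.name, SUM(f.amount) FROM fact_sales f
               JOIN dim_region d USING (region_id) GROUP BY d.name""")
print(cur.fetchall())  # [('North', 100.5), ('South', 200.0)]
```

The same shape scales up: the cleanse function becomes an ETL data-flow task, and the dimension lookup becomes a surrogate-key pipeline.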


Thursday, April 8, 2010

SOA defined by what it is not

Once in a while a new concept comes along which is so important that everyone has heard of it, and almost no one in the industry can really afford not to have an opinion on it. This concept is a buzzword ringing out so loud that entire IT department directions have been carved out on its promise, and entire companies are founded to live and die by it.

Yet this concept does not lend itself easily to a sound-bite which, when you hear it, you know you don't have to read the book. No, it was not something invented in a flash, and no one jumped out of the bath in a Eureka moment having "discovered" it. Rather legions of thinkers "developed" the fullness of its dimensions through an evolutionary process based on "best practices" and perceptive insight.

In short, it was wisdom of experience, not a law of physics. (No, it is not just biologists who have Physics Envy - a hungry longing for an elegant, fundamental law that describes a spectrum of natural phenomena and is true for all time.)

All the harder to define when this concept is an abstract Architectural pattern composed of finer-grained Design patterns. Especially hard when some of its goals seem congruent with an older Architectural pattern, making it ripe for confusion; and when many old-timers, affronted by each trend in the industry's Hype Cycle, feel they have seen it all before and that this latest new wave is simply "old wine in new bottles" - or, best of all, "we are already doing this".

I am talking about SOA.

At such times, it is often more helpful to illustrate a concept by what it is not.

Attached is a composite definition of SOA, distilled from the wisdom of IT sages. But more importantly - for those like me who understand something difficult faster and better when shown what it is not - I have listed SOA anti-patterns, with the intention of illustrating what SOA is by explaining what it should not be.

What is Not SOA


Monday, June 15, 2009

A Business Data Dictionary generated from XSD Schemas

With XSD schemas becoming the de facto Enterprise Data Dictionary at most companies, keeping schemas and the Business Data Dictionary synchronized is becoming a challenge. Business people must have a friendly, visually modeled, non-technical Data Dictionary to work with.

Tools which can generate documentation from schemas (XSDs) include DocFlex, Stylus Studio, xnsdoc, TechWriter, DocumentX, etc. However, all of them seem targeted at technical folks and produce technical documentation (namespaces, "AttributeGroups", "SimpleTypes", "ComplexTypes", etc.) - gibberish that scares business folks away.

We need a Business Data Dictionary that is always in synch with the XSD and is visually modelled, where the XSD terminology is represented by user-friendly, common-language terms.

In the absence of available tools to generate one from a schema, most companies maintain the data dictionary in Excel. However, the natural tool to model and represent a schema is a tree view, not Excel. Worse, the two often get out of synch because of the manual process involved, and a few mistakes later, business folks start to distrust the manually maintained Data Dictionary.


The best way to do this seems to be to embed xs:annotation (with its xs:documentation / xs:appinfo children) directly into the XSD and then use a configurable style sheet to generate formatted tree-view Help (CHM / HHX / framed HTML) that is a visual model of the XSD. This way the XSDs and the Data Dictionary stay in synch automatically, with no external Data Dictionary to update separately in an error-prone manual process.
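As a rough sketch of the idea (using Python rather than XSLT, and an invented two-element schema purely for illustration), the inline xs:documentation annotations can be pulled out of the XSD and rendered as an indented, business-friendly tree:

```python
import xml.etree.ElementTree as ET

XS = "{http://www.w3.org/2001/XMLSchema}"  # XML Schema namespace

# Hypothetical inline-annotated XSD: the business description lives in
# xs:annotation/xs:documentation next to each element definition.
xsd = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Customer">
    <xs:annotation><xs:documentation>A person or company we bill</xs:documentation></xs:annotation>
    <xs:complexType><xs:sequence>
      <xs:element name="DOB" type="xs:date">
        <xs:annotation><xs:documentation>Date of birth</xs:documentation></xs:annotation>
      </xs:element>
    </xs:sequence></xs:complexType>
  </xs:element>
</xs:schema>"""

def doc_text(element):
    """Return the business description embedded next to an element."""
    node = element.find(f"{XS}annotation/{XS}documentation")
    return (node.text or "").strip() if node is not None else ""

def walk(element, depth=0):
    """Print each xs:element as an indented 'name - description' line."""
    for child in element:
        if child.tag == f"{XS}element":
            print("  " * depth + f"{child.get('name')} - {doc_text(child)}")
            walk(child, depth + 1)
        else:
            walk(child, depth)  # descend through complexType/sequence wrappers

walk(ET.fromstring(xsd))
# Customer - A person or company we bill
#   DOB - Date of birth
```

A real implementation would be an XSLT over the live schema emitting framed HTML, so regenerating the dictionary is a one-step build task whenever the XSD changes.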

Amazingly, as far as my extensive research shows, there is no purchasable tool on the market which can do this (I would be glad to be proven wrong). This is the second time I have run into this need in my career. I have explained the issue in the attachment below and am seeking developers skilled with XSD and XSLT to help me with this project and provide some insights.


SchemaModelDocumenter.doc


Friday, February 6, 2009

Migrating BizTalk 2002 to 2006

Recently I was involved in migrating a BizTalk 2002 solution to 2006 for a large Canadian customs and brokerage firm. With continued growth in the past few years, and equally high growth forecast for the future, the transaction load on the company's BizTalk Server 2002 environment had risen over many years
to the point where some transactions were taking close to two minutes to process. An upgrade had become absolutely necessary, but the immediate need was to remove congestion and eliminate the single point of failure - the biggest Achilles heel of BizTalk 2002 being its lack of failover and clustering for redundancy.

BizTalk 2002, for those not familiar with it, offers only a small subset of the functionality that BizTalk 2006 has grown to provide. BizTalk Server 2006 is a significantly different product from BizTalk Server 2002. It not only provides a rich set of new features, it provides new ways to do things. For example, imagine you have a business rule that must be invoked from numerous business processes. With BizTalk 2002, you could solve this challenge in one of two ways: implement a decision shape in each orchestration, or implement the business rule in a custom object called from each orchestration. In BizTalk 2006, you may now choose to implement the business rule in the Business Rules Engine and call it from each orchestration. The question then becomes: when I migrate this area of my BizTalk 2002 solution to BizTalk 2006, do I take advantage of the new Business Rules Engine?
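The design choice above is language-agnostic, so here is a toy illustration in Python (not BizTalk code, and the rule and function names are invented): the same discount rule used by two "orchestrations". Copied inline, the rule must be changed in every process; externalized into one shared definition, it can change in a single place, which is the role the Business Rules Engine plays.

```python
def discount_rule(order_total):
    """Single shared business rule: 10% off orders over 1000.
    Externalized so every process below uses the same definition."""
    return 0.10 if order_total > 1000 else 0.0

def process_web_order(total):   # "orchestration" #1
    return total * (1 - discount_rule(total))

def process_edi_order(total):   # "orchestration" #2
    return total * (1 - discount_rule(total))

print(process_web_order(2000))  # 1800.0 (rule applied)
print(process_edi_order(500))   # 500.0  (rule not triggered)
```

The Rules Engine goes further than a shared function: rules can be versioned and redeployed without recompiling or redeploying the orchestrations that call them.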

The migration effort is riddled with issues like this. To help with the migration, I have attached a presentation comparing BizTalk 2002 and 2006 artifacts:

Comparison of BizTalk 2002 and 2006 artifacts
