This post was encouraged by a similar piece on good data management by fellow practitioner and blogger Henrik Liliendahl (Right the First Time).

For data professionals like me, who have spent years practicing and preaching the importance of data quality, cleanliness, and data management, it feels really good to see good examples of enterprise information management policies and procedures at play in real life. It feels like the message that data is an asset, a “critical success enabler for the business,” is finally being heard and accepted.

Recently I had a wonderful experience shopping for a laptop at http://www.dell.com. As I shopped on their website, I configured my laptop (I’m sure all the options for my laptop were being populated from a product MDM catalog). When I was ready to check out, before calculating shipping charges, the website prompted me to enter my shipping address. When I entered my address, the website came back with two corrected addresses, enriched with additional information such as the four-digit extension of the ZIP code, expanded abbreviations, and so on. The website required me to choose one of the corrected/enriched addresses before completing my order. This suggests they have implemented a solution which checks the validity and conformance of address information before letting it enter their systems. This investment obviously has many benefits for Dell, and they must have implemented these data quality/standardization solutions as part of a broader Enterprise Information Management framework. I was really happy for Dell. This process also meant that my order was delivered ahead of schedule without additional charge.
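To make the idea concrete, here is a minimal sketch of the kind of address standardization check described above. Everything in it is illustrative: the abbreviation table is a tiny invented subset, and real services of the kind Dell appears to use validate against certified reference data (such as USPS ZIP+4 files) rather than simple string rules.

```python
import re

# Illustrative subset of street/unit abbreviation expansions (not the full USPS list)
EXPANSIONS = {"st": "Street", "ave": "Avenue", "blvd": "Boulevard", "apt": "Apartment"}

def standardize_address(raw: str) -> dict:
    """Toy normalizer: expand abbreviations and flag a missing ZIP+4 extension.

    A real verification solution would also match the address against
    authoritative reference data before accepting it into the system.
    """
    tokens = raw.replace(",", " ").split()
    expanded = [EXPANSIONS.get(t.lower().rstrip("."), t) for t in tokens]
    standardized = " ".join(expanded)
    # A 5-digit ZIP without the 4-digit extension signals missing enrichment.
    has_zip4 = bool(re.search(r"\b\d{5}-\d{4}\b", standardized))
    return {"standardized": standardized, "needs_zip4": not has_zip4}

result = standardize_address("123 Main St, Apt. 4, Springfield 62704")
```

The point of a check like this at order entry is that bad addresses are stopped before they ever land in downstream systems, which is exactly the “right the first time” principle.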

I am writing this because I believe in applauding and appreciating efforts done the right way. For transparency: I am not related to dell.com in any professional way (employment, contract, etc.), nor did Dell hire me to write this blog post. I am one of the thousands of customers they have. I just want to say: good job, Dell.com.

I would like to appeal to all fellow bloggers and practitioners to cite examples of good information management, data management, or data governance practices at work in the public domain and write about them. Tweet about them under the #goodeim tag. We have heard too many horror stories; many organizations have been diligently implementing very successful information management practices, so let us encourage and applaud those efforts openly.


This is the fourth blog entry in a series highlighting how to go about securing executive sponsorship for data governance initiatives. In previous posts, I have highlighted the need to understand the specific KPIs/metrics which executives track, and the tangible goals being set against those KPIs.

Almost always, there is an individual or a group of individuals who work tirelessly on producing the reports with KPIs/metrics for executives. These individuals often have a clear and precise understanding of how the metrics/KPIs are calculated and what issues, if any, exist in the underlying data which supports them.

It is worthwhile to spend time with these people to get a leg up on understanding metric/KPI definitions and knowledge around data issues (data quality, consistency, system of record). Engaging these individuals will also help in winning the confidence of the people who know the actual details of the KPIs/metrics and the processes for calculating and reporting them. These individuals will likely be part of your data governance team and are crucial players in winning a vote of confidence from executives on the value data governance initiatives create.

In one of my engagements with a B2B customer, executive management had the goal of improving business with existing customers, so they wanted to track net new versus repeat business. Initially the sales operations team had no way of reporting on this KPI, so in the early days they reported using statistical sampling. Ultimately, they created a field in their CRM system to capture new or repeat business on their opportunity data, and this field was used for the net new versus repeat business KPI reporting. Unfortunately, the field was always entered manually by a sales rep while creating the opportunity record. While the sales operations team knew this was not entirely accurate, they had no way of getting around it.

In my early discussions with the sales operations team, when I came to know about this, I did a quick assessment on a quarter’s worth of data. After basic de-duping and some cleansing, I compared my numbers against theirs, and there was a significant difference. This really helped me get the sales operations team on board with data cleansing and ultimately data governance around opportunity, customer, and prospect data. The discussion also helped us clearly define what business should be considered net new versus repeat.
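The shape of that quick assessment can be sketched in a few lines. The records, the flag name, and the normalization rule below are all hypothetical stand-ins; the point is simply that de-duping account names exposes the gap between the manually entered flag and reality.

```python
# Hypothetical opportunity records; "is_repeat" is the manually entered CRM flag.
opportunities = [
    {"account": "Acme Corp",  "is_repeat": False},
    {"account": "ACME Corp.", "is_repeat": False},  # duplicate spelling, double-counted as net new
    {"account": "Globex",     "is_repeat": False},
    {"account": "Initech",    "is_repeat": True},
]

def normalize(name: str) -> str:
    """Crude account-name normalization for de-duping: lowercase, keep alphanumerics only."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def assess(records):
    """Compare the rep-entered count of net-new accounts against a de-duplicated count."""
    flagged_new = [r for r in records if not r["is_repeat"]]
    return {
        "manual_net_new": len(flagged_new),
        "deduped_net_new": len({normalize(r["account"]) for r in flagged_new}),
    }

result = assess(opportunities)
```

Even on this toy data the manual count overstates net-new business, which is exactly the kind of tangible discrepancy that wins a sales operations team over.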

Obviously, as one goes through this process of collecting information around metrics, the underlying data, and the process by which the numbers are crunched, it helps to have proper tools and technology in place to capture this knowledge. For example:

a) Capturing definitions of metrics

b) Capturing metadata around data sources

c) Lineage, actual calculations behind metrics, etc.

This process of capturing definitions, metadata, lineage, etc. will give you high-level visibility into the scope of things to come. Metadata and lineage can be used to identify the business processes and systems which impact the KPIs.
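The three knowledge-capture items above can be represented with even a very simple schema while you interview the people who produce the reports. The field names and the example KPI below are my own invention, not a standard; dedicated metadata tools would capture far more.

```python
from dataclasses import dataclass, field

@dataclass
class KpiDefinition:
    """Minimal, illustrative record for capturing KPI knowledge during interviews."""
    name: str
    definition: str                                # a) agreed business definition
    sources: list = field(default_factory=list)    # b) metadata around data sources
    lineage: list = field(default_factory=list)    # c) calculation steps from raw data to number

# Hypothetical entry for the net-new-versus-repeat KPI discussed above
net_new = KpiDefinition(
    name="Net New vs. Repeat Business",
    definition="Revenue from accounts with no closed-won opportunity in the prior 4 quarters",
    sources=[{"system": "CRM", "object": "Opportunity", "field": "business_type"}],
    lineage=["Extract closed-won opportunities", "De-dupe accounts", "Classify new vs. repeat"],
)
```

Even a lightweight record like this forces the team to write down the definition, the source systems, and the calculation steps in one place, which is the real value of the exercise.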

In summary, finding the people behind the operations of putting together KPIs helps in identifying subject matter experts who can give you clear and high-value pointers to the areas which data governance initiatives need to focus on early. It will ultimately help you recruit people with the right skill set and knowledge into your cross-functional data governance team.

Previous posts on this or related topics:

Litmus Test for Data Governance Initiatives: What do you need to do to garner executive sponsorship?

Data Governance Litmus Test: Know thy KPIs

Data Governance Litmus Test: Know goals behind KPIs

Suggested next posts:

Data Governance Litmus Test: Do You Have Access to the Artifacts Used by Executives?

This is the third blog post in a series geared towards addressing the “Why, What and How?” of getting executive sponsorship for data governance initiatives. In my last post, Data Governance Litmus Test: Know thy KPIs, I explored the importance of knowing KPIs to be able to build a link between data governance initiative outcomes and organizational strategy. In this post I’m going to explore why it is important to know the specific goals for the KPIs which executives monitor on a periodic basis towards fulfilling organizational strategy.

Data governance initiatives typically span multiple organizations, key business processes, heterogeneous systems/applications, and several people from different lines of business. Any time one is dealing with such a complex composition of players and stakeholders, it is extremely crucial to be articulate about business goals and the impact of the actions at hand on those goals. Once people understand the magnitude of the impact, and how they will be responsible for it, getting their cooperation and alignment becomes relatively easy.

Once you understand the KPIs which are important organizationally, you need to drill down one level to understand which specific goals matter. The process of understanding specific goals will undoubtedly reveal many contributing factors to the fulfillment of the overall goals.

For example:

Suppose one of the major KPIs which executives are tracking is overall spend. At this stage it is important for the data governance team to understand the specific goals around this KPI, which could be:

1. The chief procurement officer has been asked to reduce spend by 2% within four quarters.

2. A 2% reduction across the board represents $80 million in savings.

3. This savings alone would allow the organization to improve its profitability by almost a penny per share, which will ultimately reflect positively in the share price and benefit all employees of the organization.

Once such details are known, establishing a dialogue with the chief procurement officer and his/her key advisers might further reveal that:

1. Their focus is going to be on three specific areas (specific products/raw materials).

2. Not having a singular view of suppliers is a key concern. Because of this issue they are not able to negotiate consistent pricing contracts with suppliers. They believe that streamlining contracts based on overall spend with suppliers and their subsidiaries will get them more than 70% of the way to their goal.

3. Supplier contracts are not being renewed consistently, resulting in higher costs in the form of minimum-business guarantees and price-point guarantees.

Equipped with this information, it will be much easier for the data governance team to link their efforts to the overall goal of reducing spend. For example, with this information gathered, one can already pinpoint that the teams working on supplier development, contract negotiations, pricing, and so on are going to be critical to bring on board for this data governance initiative. It is also clear from these nuggets of information that overall spend, the number of suppliers, and the number of materials/products being procured will be some of the key metrics, and the interrelationships between those metrics will be critical to link any ROI from initiatives to clean supplier data, build a supplier MDM, and so on.

With this information, the data governance team can communicate not only to their team members but also to the executives that X percent of duplicate data in the supplier master potentially represents Y dollars of excess spend. The team will be able to explain not only how this can be fixed, but what is required to maintain this hygiene on an ongoing basis, given the impact it has on overall excess spend.
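The X-percent-to-Y-dollars link is back-of-the-envelope arithmetic. In the sketch below, the $4 billion total spend follows from the $80M = 2% figures above, but the 15% duplicate rate and 5% price premium are purely illustrative assumptions you would replace with measured values.

```python
def excess_spend_from_duplicates(total_spend: float,
                                 duplicate_rate: float,
                                 overpay_rate: float) -> float:
    """Rough estimate: duplicated supplier records fragment spend, so the
    fragmented portion misses negotiated volume pricing.

    duplicate_rate: share of supplier spend sitting under duplicate records
    overpay_rate:   assumed price premium paid on that fragmented spend
    """
    return total_spend * duplicate_rate * overpay_rate

# $4B total spend, 15% under duplicate records, 5% premium -> roughly $30M
estimate = excess_spend_from_duplicates(4_000_000_000, 0.15, 0.05)
```

A number like this is deliberately approximate; its job is to give executives an order-of-magnitude link between data hygiene and their spend-reduction goal.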

In summary, it is really important to understand the goals behind the “what?” of the organizational strategy. Other indirect benefits of this kind of exercise are:

1. Establishing communication and contacts with the business stakeholders.

2. Understanding the areas where you can focus upfront for the highest impact.

3. Learning the language which you can use to effectively communicate the ROI of data governance back to the executives.

In my next post, I will explore who is behind putting together these KPIs for executives in the current situation. These people are the most critical players on the data governance team, at both the execution and implementation levels, as initiatives are kicked off.

Previous posts on this or related topics:

Litmus Test for Data Governance Initiatives: What do you need to do to garner executive sponsorship?

Data Governance Litmus Test: Know thy KPIs

Suggested next posts:

Data Governance Litmus Test: How and who is putting together metrics/KPIs for executives?

Data Governance Litmus Test: Do You Have Access to the Artifacts Used by Executives?

In my last post about the Data Governance Litmus Test, I outlined 10 questions which can be used as a litmus test to figure out how you are doing in terms of garnering executive sponsorship for your data governance initiatives. In this post, I’m going to explore why it is important to know organizational KPIs/initiatives before and during data governance initiatives.

The KPIs or metrics which CXOs look at are clear-cut indicators of where the organizational focus is from the perspective of operational excellence. They also serve as an early indicator of overall organizational strategy. Knowing these KPIs firsthand helps the teams involved in data governance initiatives internalize what is important to organizational strategy, and helps them understand where executive focus is within the organization.

Many times when I ask teams working on data governance initiatives which KPIs are being tracked by executive management, I get standard answers: “Our executives are looking at sales and cost related KPIs.” This is a clear indication that the team has not made a significant effort to understand the KPIs, has not established a communication channel with executive management, and has not emphasized the need for the data governance team to understand the KPIs.

While ultimately the goal of an organization is to increase revenue, minimize cost, and maximize profitability, there are several steps and ways by which these goals are achieved. From marketing, procurement, and finance to sales, specific goals are set as part of achieving the business plan, and these goals are tracked by the executive management team on a periodic basis. Often these goals will change from time to time to adjust for changes in strategy as well as changes in the overall goals. Understanding the details of the KPIs across different parts of the organization helps data governance teams link their activities to specific KPIs and the results associated with them.

The process of engaging with executive management and making a case to understand KPIs in detail helps the data governance initiative in multiple ways:

1. It helps with establishing a communication channel, credibility, and a relationship with executive management and their goals/mission.

2. It gives the team visibility into the very specific KPIs which are important for organizational growth and for the growth of individual executives within the organization.

3. It helps create context for the data governance discussion and the change management process across the entire organization. No one can dispute the need for reporting on and improving these KPIs.

4. Once you establish a communication channel/relationship with executives around these KPIs, and if you are able to demonstrate the value you and the initiative you are proposing (data governance) can add to the KPIs, executives will get in the habit of involving the data governance team whenever KPIs change or there are issues with reporting them.

5. The confidence and trust which you build through this exercise will make it easy to ask for executive sponsorship. Executives will be more than willing to support your initiatives once they see a clear line connecting data governance initiatives with their KPIs and progress.

The process of getting to know these KPIs is an important one. When collecting information about these KPIs, it is important to capture significant detail:

1. The name of the KPI.

2. How executives define the KPI, that is, how the KPI is measured and calculated in the executives’ minds.

3. From the executive perspective, which business processes impact/influence the KPI, and which roles (and possibly named people) will have the most influence on its outcome.

4. Periodicity: how often is the KPI reported on?

5. A clear linkage between the KPI and a specific organizational strategy, ultimately rolling up into the vision leadership has created for the organization.

6. It may also be beneficial to understand how these KPIs will help executives achieve their personal goals.

As always, the devil is in the details. If the CFO’s goal is to reduce DSO (days sales outstanding), then being able to understand from the CFO’s perspective how DSO is impacted by collection and CRM processes is important. For all you know, unclean addresses might be at the root of the inability to collect payments (at least one of the reasons behind a larger DSO number). If you follow the recommendations above, you will be able to tangibly demonstrate the linkage between the cleanliness issue and DSO, and will be able to garner support from the CFO on this issue on an ongoing basis.
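For readers less familiar with the metric, DSO has a standard textbook formula: receivables divided by credit sales, scaled by the days in the period. The dollar figures below are purely illustrative. The connection to data quality is that undeliverable invoices (from unclean addresses) inflate the receivables numerator by delaying collections.

```python
def days_sales_outstanding(accounts_receivable: float,
                           total_credit_sales: float,
                           days_in_period: int = 90) -> float:
    """Standard DSO formula: (receivables / credit sales) * days in period."""
    return accounts_receivable * days_in_period / total_credit_sales

# Illustrative quarter: $12M outstanding against $36M of credit sales -> 30 days
dso = days_sales_outstanding(12_000_000, 36_000_000, 90)
```

If cleansing addresses lets even a fraction of stuck invoices get collected, the receivables number drops and the improvement shows up directly in this formula, which is the tangible linkage the CFO will care about.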

At this stage I am not focusing on specific technology investments, but as you can see, any technology solution which allows you to capture strategy and KPIs and link business processes to these artifacts will be a good way to capture this information.

In my next post around the litmus test questions, I will explore the need for understanding the specific goals behind these KPIs.

Previous relevant posts:

Litmus Test for Data Governance Initiatives: What do you need to garner executive sponsorship?

Suggested reading next:

Data Governance Litmus Test: Know goals behind KPIs

Data Governance Litmus Test: How and who is putting together metrics/KPIs for executives?

Data Governance Litmus Test: Do You Have Access to the Artifacts Used by Executives?

It was a long day in Cincinnati: we had a full day of conference sessions, demos, and discussions around data governance and data quality topics. Some of my colleagues and friends in the industry decided to retire to a bowling alley. Over the bowling game and a few beers we naturally resorted to talking about the same topic we had been discussing all day. After a couple of hours of discussion, lanes #7, #8, and #9 came to the same conclusion: one of the toughest parts of data governance initiatives is ongoing executive sponsorship and the need to demonstrate tangible ROI.

On my flight back home I jotted down some thoughts on this topic and put together a basic list of questions which can be used as a litmus test to validate whether all the right steps are being taken to ensure ongoing executive sponsorship and tangible ROI proof points for a data governance initiative.

Everybody who is involved in some sort of data governance initiative knows the importance of having executive sponsorship for the overall success and viability of data governance programs.

It is really important to have the right level of understanding about organizational goals and drivers. With specific knowledge of organizational initiatives it is much easier to link a data governance initiative to specific organizational goals/drivers. Creating this link will help in building the necessary ROI case for data governance as well as garnering executive sponsorship.

So here is the list of 10 simple yet relevant questions which I propose every data governance team should use as a litmus test from time to time, to validate whether they are on the right track to ensure ongoing executive sponsorship and the capacity to demonstrate tangible ROI to the organization.

1. Every Monday morning the CEO and his or her direct reports meet to review organizational KPIs. Do you know precisely which metrics are being looked at on a weekly basis?

2. Do you know what the goals are for those KPIs?

3. Do you know how each of those metrics/KPIs is put together, and by whom?

4. Do you know which KPIs are not meeting their desired goals?

5. Do you have a sample presentation or report which the executives look at in the Monday morning meeting?

6. Once you have an idea of the key metrics and the people who put those metrics together for executives, do you know which systems are responsible for generating and managing the raw data required for those metrics?

7. Do you have an understanding of the quality, reliability, and timeliness of the data which is being used to put together those metrics?

8. Have you found issues with the quality, reliability, or timeliness of the data, or with how the data is managed on an ongoing basis?

9. Have you shared your findings about the quality and reliability of the raw data behind the weekly KPIs with the executives who are responsible for those KPIs?

10. Have you reached a common understanding with the key executives whose KPIs are being impacted regarding the need to address data quality, reliability, timeliness, or data management issues, and the benefits of such actions/initiatives?

If you answered yes to all of the questions above, you are well on your way to generating tangible ROI and garnering executive sponsorship for your data governance initiatives, and you have a very high chance of achieving all of the goals of your data governance initiative.

On the other hand, if you did not answer yes to one or more of the questions above, it is time to go back to the whiteboard and ask: how well have you really justified the ROI? How realistic is that case? And do you truly have executive sponsorship and support for your data governance initiatives?

I would love to hear your thoughts on this topic. In particular, would you add any questions, take away any of the above, or simplify any of them?

In my opinion, how and when you ask these questions and take appropriate action will differ based on where you are in the life cycle of the project. In future posts, I will discuss the relevancy of this litmus test and the other factors influencing your actions for two scenarios:

1. Teams who are in the beginning phases of proposing a data governance program but have not yet started.

2. Teams who are already working through their data governance programs.

Suggested reading next:

Data Governance Litmus Test: Know thy KPIs

Data Governance Litmus Test: Know goals behind KPIs

Data Governance Litmus Test: How and who is putting together metrics/KPIs for executives?

Data Governance Litmus Test: Do You Have Access to the Artifacts Used by Executives?

There were many predictions in the software industry for 2010. One of the industry’s thought leaders, Nenshad Bardoliwalla, shared his predictions in the area of “Trends in Analytics, BI and Performance Management.” His predictions about how vendors will offer packaged, strategy-driven execution applications, about slice-and-dice capabilities from BI vendors returning to their decision-centric roots, and about advanced visualization capabilities got me thinking about my favorite topic: purpose-built applications.

What is a purpose-driven/built analytic application (PDAA) after all? It is an analytic application which addresses a narrowly focused business area or process and provides insight into the opportunities (for improvement) and challenges (performance). In order for such an analytic application to provide insight:

  1. It needs to be designed for a specific purpose (or problem), and that purpose or focus really needs to be narrow (to be able to provide holistic insight)
  2. It needs to rely on purpose-built visualization and use Web 2.0 style technologies to make analytic insight pervasive (some examples follow)
  3. It needs to provide descriptive, prescriptive, and predictive capabilities to provide holistic insight
    1. Descriptive capabilities provide a view into the current state of affairs
    2. Prescriptive capabilities tell users what they need to focus on as a follow-up, and help guide them to the questions they should ask next to build holistic insight
    3. Predictive capabilities facilitate what-if analysis and provide insight into what the business might expect should the current situation continue
    4. Implicitly, the application suggests the questions users should ask in a given situation and provides either complete answers or the data points leading up to those answers

Often, because of the very specific purpose and narrow focus, most of the insights provided by purpose-built analytic applications can be manifested right in the operational application via purpose-built gadgets or even purpose-built controls. A single dashboard with interactivity around its widgets/gadgets will typically provide complete insight into the focus/purpose of the analytic application.

Let us discuss an example of what a purpose-built analytic application could be. Every organization with a sales force actively selling its products/services has a weekly call to review the pipeline. This is typically done on a region-by-region basis, and the data is then rolled up to a global level. A purpose-driven analytic application in this situation would be a “Weekly Pipeline Review” application. Rather than providing free-form slicing/dicing and reporting capabilities around pipeline data (which would be the traditional way), this type of application will focus on:

  1. The current pipeline
  2. Changes to the pipeline from last week (positive and negative, as this is what is watched most closely on the call to make sure forecast numbers can be achieved)
  3. The impact of the changes to the pipeline on achieving goals/forecast; based on these changes, extrapolating the impact on the sales organization’s plans (what-if)
  4. Visibility into deals which might be problematic based on past performance and heuristics (this is what I call prescriptive)
  5. Visibility into deals which are likely to move and close faster, again based on past performance (again prescriptive)
  6. Account names where incremental upsell can be done (again based on past performance in similar accounts) but where there are no active deals/opportunities
  7. Visibility into individuals and regions which are at risk of missing their forecast based on their past and current performance

There are different visualizations which can be used to build this type of application. The focus of this analytic application is to help sales VPs and sales operations get through the weekly pipeline review call quickly by focusing on exceptions (both positive and negative) and to provide full insight into the impact of the changes and the areas they should focus on. Hopefully this explains in detail the difference between a purpose-built analytic application and a traditional data warehouse or traditional analytic application.
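The core of the “changes from last week” view can be sketched as a simple comparison of two pipeline snapshots. The opportunity ids and amounts below are hypothetical, and a real application would of course layer the prescriptive and predictive pieces on top.

```python
# Hypothetical pipeline snapshots: opportunity id -> amount in dollars
last_week = {"opp-1": 100_000, "opp-2": 250_000, "opp-3": 80_000}
this_week = {"opp-1": 100_000, "opp-2": 300_000, "opp-4": 50_000}  # opp-3 dropped, opp-4 new

def week_over_week(prev: dict, curr: dict) -> dict:
    """Classify pipeline movement the way a weekly review call walks through it."""
    return {
        "new":     {k: curr[k] for k in curr.keys() - prev.keys()},
        "dropped": {k: prev[k] for k in prev.keys() - curr.keys()},
        "changed": {k: curr[k] - prev[k] for k in curr.keys() & prev.keys()
                    if curr[k] != prev[k]},
        "net_change": sum(curr.values()) - sum(prev.values()),
    }

delta = week_over_week(last_week, this_week)
```

Surfacing only these exceptions, rather than the full pipeline, is exactly what lets the review call move quickly; the free-form slicing and dicing of a traditional BI tool is deliberately left out.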

Let us now briefly look at how a purpose-built UI supports some of the important aspects (holistic insight) of purpose-built applications. Many of you have used the iGoogle portal and have added iGoogle gadgets. One can look at iGoogle gadgets as purpose-built applications which focus on one specific area of interest to you. Take a look at one of the samples put together by Pallav Nadhani to demonstrate FusionCharts visualizations. This gadget is a perfect example of how a purpose-built UI helps create the focus and holistic insight of an analytic application: it provides a complete weather picture for a location, for today or for the future.

There is a company out of New Zealand, Sonar6, which provides a product solution around performance management (very focused, purpose-driven) and talent management. They have done a fantastic job of building a purpose-built application and delivering it through a purpose-built UI. I especially like the way they have provided analytic and reporting capabilities (the helicopter view) around performance management. You can register for their demo or look at their brochure/PowerPoint presentations.

There are several other vendors who have made purpose-built analytics pervasive in our day-to-day lives. The recommendation engine built by Amazon is a perfect example of purpose-built analytics.

In the end, I truly believe that purpose-built analytic applications can and will maximize the value and insight delivered to end users and customers while keeping the focus of the analytics narrow.

I would love to know your thoughts around purpose-built applications. What has been your experience?