
Archive for the ‘Analytics’ Category

In his recent blog post titled Data and Process Transparency, Jim Harris makes the case that “a more proactive approach to data quality begins with data and process transparency”. This is very true of any organization striving for the availability of the highest-quality data for its decision-making as well as for the efficiency of its business processes.

In order to embed transparency in every situation across the organization, organizations need to be data driven. What does it mean to be a data driven organization, you may ask? There is a great deal of literature around this topic in print as well as on the web. I will try to simplify the discussion and say that, to me, when the culture of decision making within an organization is based purely on factual data (KPIs, metrics, etc.) and not on the gut feel, emotions, and subjectivity of the individuals making decisions, the organization can be said to have become a data driven organization. Of course, this is a very simplistic definition (just for the purpose of this blog).

Depending upon organizational maturity, you may have organizations which are completely data driven versus organizations which are more mature in one area (vis-à-vis data driven decision-making) than in others. For example, in some cases the finance side of the organization might be much more data driven than either the marketing or the sales side.

Using data to make decisions drives both data and process transparency across the organization. It discourages the use of anecdotal information (and gut feel) and forces people to think in terms of the realistic evidence presented by data. Using specific KPIs/metrics also allows organizations to more readily define the issues associated with underlying data or business processes.

For example, if the sales operations team in a data driven organization is discussing order return rates, they cannot simply say that “we have a very low order return rate because of poor addresses”. They will say that they have a 1% order return rate, on an average of 125,000 orders shipped every month, because of poor shipping addresses. Expressing performance this way not only helps everyone involved understand the importance of good data quality but also helps the organization create sensitivity around capturing good data to begin with. It also provides a ready-made business case for supporting the underlying data management initiatives.
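
As a back-of-the-envelope illustration, the same statement reduces to a couple of lines of Python; the variable names and the figures are simply the ones used in the example above, not taken from any particular system.

monthly_orders = 125_000          # average orders shipped per month
returned_for_bad_address = 1_250  # orders returned due to poor shipping addresses

return_rate = returned_for_bad_address / monthly_orders
print(f"Order return rate due to bad addresses: {return_rate:.1%}")  # 1.0%
print(f"Orders returned per month: {returned_for_bad_address:,}")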

Transforming an organization into a data driven organization is a gargantuan change management task. It requires significant cultural change up and down the organizational hierarchy. During such transformations, the organization's operational DNA is completely changed. Obviously, the benefits and rewards of being a data driven organization are immense and worth the effort of the transformation.

On the other hand, if during this transformation the organization finds that its data is not of reliable quality, that finding will force a data management discussion across the organization and help kick-start initiatives to fix the data as more and more people start using data for decision making.

In the end, I would encourage everyone to be as data driven as possible in their decision making and to influence areas within their organization to become data driven. As data professionals, this will allow us to be more proactive in addressing data management challenges for the organization.


Read Full Post »

Yesterday as I was driving to work, there was fog everywhere in the area where I live; we were fogged in, so to speak. The typical commute from my house to the nearest freeway takes about 10 minutes; yesterday, it took 25 minutes. Visibility was poor; I could hardly see more than 100 yards ahead of me and about the same distance behind me. This meant I was driving very cautiously, not at all confident about what was ahead of me. While I was driving through this dense fog, a thought came to my mind: isn’t it true that business decision makers go through a similar predicament when they are faced with a lack of reliable, high quality data for decision-making?

Poor data quality means less visibility into the performance of the organization; it also impairs decision-making based on actual data. As with fog, poor data quality means business decisions are made slowly, overcautiously, and many times based on gut feel rather than factual data. Slowness in decision-making can mean losing the edge a business has over its competition. There is a lot in common between driving through fog and trying to run a business with poor quality data.

As the sun rises and the temperature increases, fog burns off. In the same way, effective data quality and data governance initiatives will help burn away the fog created by poor data quality. Burning off the fog is a slow and steady process; all the right conditions need to exist before the fog disappears. It is the same with addressing data quality holistically within the enterprise. The right conditions need to be created in terms of executive sponsorship, understanding of the importance of good data quality, clear demonstration of the value created by data assets, etc. before the true fruits of data quality initiatives can be harvested.

Superior data quality, and the timely availability of high-quality data, has a significant impact on day to day business operations as well as on the strategic initiatives a business undertakes.

Read Full Post »

This is the third post in a series geared towards addressing the “Why, What and How?” of getting executive sponsorship for data governance initiatives. In my last post, Data Governance Litmus Test: Know thy KPIs, I explored the importance of knowing KPIs in order to build a link between data governance initiative outcomes and organizational strategy. In this post I am going to explore why it is important to know the specific goals behind the KPIs which executives monitor periodically in fulfilling the organizational strategy.

Data governance initiatives typically span multiple organizations, key business processes, heterogeneous systems/applications, and several people from different lines of business. Any time one is dealing with such a complex composition of players and stakeholders, it is extremely important to be articulate about the business goals and the impact of the actions at hand on those goals. Once people understand the magnitude of the impact, and how they will be responsible for it, getting their cooperation and alignment becomes relatively easy.

Once you understand the KPIs which are important organizationally, you need to drill down one level below to understand which specific goals matter. The process of understanding specific goals will undoubtedly reveal many factors contributing to the fulfillment of the overall goals.

For example, suppose one of the major KPIs which executives are tracking is overall spend. At this stage it is important for the data governance initiative team to understand the specific goals around this KPI, which could be:

1. The chief procurement officer has been asked to reduce spend by 2% within four quarters.

2. A 2% reduction across the board represents $80 million in savings.

3. These savings alone would allow the organization to improve its profitability by almost a penny per share, which will ultimately reflect positively in the share price and benefit all the employees of the organization.

Once such details are known, establishing a dialogue with the chief procurement officer and his/her key advisers might further reveal that:

1. Their focus is going to be on three specific areas (specific products/raw materials).

2. Not having a single view of suppliers is a key concern. Because of this issue they are not able to negotiate consistent pricing contracts with suppliers. They believe that streamlining contracts based on overall spend with suppliers and their subsidiaries will help them achieve more than 70% of their goal.

3. Supplier contracts are not being renegotiated consistently, resulting in higher costs in terms of minimum business guarantees and price point guarantees.

Equipped with this information, it will be much easier for the data governance team to highlight and link their efforts to the overall goal of reducing spend. For example, with some of this information gathered, one can already pinpoint that the teams working on supplier development, contract negotiations, pricing, etc. are going to be critical to get on board with the data governance initiative. It is also clear from these nuggets of information that overall spend, the number of suppliers, and the number of materials/products being procured will be some of the key metrics, and that the interrelationships between those metrics will be critical for linking any ROI to initiatives to clean supplier data, build supplier MDM, and so on.

With this information the data governance team can now communicate, not only to their team members but also to the executives, that X percent of duplicate records in the supplier master would potentially represent Y dollars of excess spend. The data governance team will be able to explain not only how this can be fixed but also what is required to maintain this hygiene on an ongoing basis, because of the impact it has on overall excess spend.
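
As an illustration, here is a minimal Python sketch of how such an X-percent-to-Y-dollars estimate might be framed. The spend figure, duplicate rate, and leakage assumption are placeholders for the sake of the example, not figures from any real engagement.

total_annual_spend = 4_000_000_000   # overall spend, e.g. $4B (illustrative)
duplicate_supplier_rate = 0.08       # "X": share of supplier records that are duplicates (assumed)
leakage_rate = 0.02                  # assumed lost negotiation leverage on affected spend

spend_through_duplicates = total_annual_spend * duplicate_supplier_rate
estimated_excess_spend = spend_through_duplicates * leakage_rate  # "Y"

print(f"Spend flowing through duplicated supplier records: ${spend_through_duplicates:,.0f}")
print(f"Estimated excess spend (Y): ${estimated_excess_spend:,.0f}")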

In summary, it is really important to understand the goals behind the “what?” of the organizational strategy. Other indirect benefits of this kind of exercise are:

1. Establishing communication and contacts with the business stakeholders.

2. Understanding the areas where you can focus upfront for the highest impact.

3. Understanding and learning the language you can use to effectively communicate the ROI of data governance back to the executives.

In my next post, I will explore who is behind putting together these KPIs for executives in the current situation. These people are the most critical players on the data governance team at both the execution and implementation levels as the initiatives are kicked off.

Previous posts on this or related topics:

Litmus Test for Data Governance Initiatives: What do you need to do to garner executive sponsorship?

Data Governance Litmus Test: Know thy KPIs

Suggested next posts:

Data Governance Litmus Test: How and who is putting together metrics/KPIs for executives?

Data Governance Litmus Test: Do You Have Access to the Artifacts Used by Executives?

Read Full Post »

In my last post about the Data Governance Litmus Test, I outlined 10 questions which can be used as a litmus test to figure out how you are doing in terms of garnering executive sponsorship for your data governance initiatives. In this post, I am going to explore why it is important to know the organizational KPIs/initiatives before and during data governance initiatives.

The KPIs or metrics which CXOs look at are clear-cut indicators of where the organizational focus is from the perspective of operational excellence. They also serve as an early indicator of overall organizational strategy. Knowing these KPIs firsthand helps the teams involved in data governance initiatives internalize what is important for the organizational strategy. It also helps with understanding where executive focus is within the organization.

Many times when I ask this question (which KPIs are being tracked by executive management) of teams working on data governance initiatives, I get a standard answer: “Our executives are looking at sales and cost related KPIs.” This is a clear indication that the team has not made a significant effort to understand the KPIs, has not established a communication channel with executive management, and has not emphasized the need for the data governance team to understand the KPIs.

While ultimately the goal of an organization is to increase revenues, minimize costs and maximize profitability, there are several steps and ways by which these goals are achieved. From marketing, procurement, and finance to sales, there are specific goals which are set as part of achieving the business plan, and these goals are tracked by the executive management team on a periodic basis. Many times these goals will change to adjust for a change in strategy as well as changes in the overall goals. Understanding the details of the KPIs across different parts of the organization helps data governance teams link their activities to specific KPIs and to the results associated with those KPIs.

The process of engaging with executive management and making a case to understand the KPIs in detail helps the data governance initiative in multiple ways:

1. It helps with establishing a communication channel, credibility, and a relationship with executive management and their goals/mission.

2. It gives the team visibility into the very specific KPIs which are important for organizational growth and for the growth of individual executives within the organization.

3. It helps create context for the data governance discussion and the change management process across the entire organization. No one can dispute the need for reporting and improving these KPIs.

4. Once you establish a communication channel/relationship with executives around these KPIs, and if you are able to demonstrate the value you and the initiative you are proposing (data governance) can add to the KPIs, executives will get in the habit of involving the data governance team whenever either the KPIs change or there are issues with reporting them.

5. The confidence and trust you can build through this exercise will make it easy to ask for executive sponsorship. Executives will be more than willing to support your initiatives as they see a clear line connecting data governance initiatives with their KPIs and progress.

The process of getting to know these KPIs is an important one. When collecting information about these KPIs, it is important to capture significant detail around each of them (a minimal sketch of such a KPI record follows this list):

1. The name of the KPI.

2. How executives define the KPI, that is, how, in the executives’ minds, it is measured and calculated.

3. From the executive perspective, which business processes impact/influence the KPI, and which roles (and possibly named individuals) will have the most influence on its outcome.

4. Periodicity: how often is the KPI reported on?

5. A clear linkage between the KPI and a specific organizational strategy, ultimately rolling up into the vision leadership has created for the organization.

6. It may also be beneficial to understand how the KPI will help executives achieve their personal goals.
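
Here is a minimal sketch of what such a KPI “fact sheet” might look like as a simple data structure; the class, field names, and example values are hypothetical illustrations, not part of any standard or tool.

from dataclasses import dataclass, field
from typing import List

@dataclass
class KPIRecord:
    name: str                      # 1. name of the KPI
    definition: str                # 2. how executives define/calculate it
    influencing_processes: List[str] = field(default_factory=list)  # 3. processes that move it
    owners: List[str] = field(default_factory=list)                 # 3. roles/people with most influence
    periodicity: str = "monthly"   # 4. how often it is reported
    strategy_link: str = ""        # 5. linkage to organizational strategy
    executive_goal: str = ""       # 6. how it ties to an executive's own goals

dso = KPIRecord(
    name="Days Sales Outstanding (DSO)",
    definition="(Accounts receivable / total credit sales) x number of days in period",
    influencing_processes=["collections", "billing", "CRM address capture"],
    owners=["CFO", "VP Collections"],
    periodicity="monthly",
    strategy_link="Improve operating cash flow",
    executive_goal="Reduce DSO within two quarters",
)
print(dso.name, "-", dso.periodicity)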

As always, the devil is in the details. If the CFO’s goal is to reduce DSO (days sales outstanding), then being able to understand, from the CFO’s perspective, how DSO is impacted by collection and CRM processes is important. For all you know, unclean addresses might be at the root of the inability to collect payments (at least one of the reasons behind a larger DSO number). If you follow the recommendations above, you will be able to tangibly demonstrate the linkage between the cleanliness issue and DSO, and you will be able to garner support from the CFO on this issue on an ongoing basis.
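
To make the arithmetic concrete, here is a small, hypothetical Python illustration of the standard DSO formula and of how invoices delayed by bad addresses can inflate it. All of the numbers are assumed purely for the example.

days_in_period = 90
credit_sales = 90_000_000                 # credit sales in the period (assumed)
receivables_clean = 30_000_000            # receivables if every invoice reaches the customer promptly
stuck_due_to_bad_addresses = 6_000_000    # invoices delayed/re-sent because of bad addresses (assumed)

# DSO = (accounts receivable / credit sales) * days in period
dso_clean = receivables_clean / credit_sales * days_in_period
dso_actual = (receivables_clean + stuck_due_to_bad_addresses) / credit_sales * days_in_period

print(f"DSO with clean addresses : {dso_clean:.0f} days")   # 30 days
print(f"DSO with address issues  : {dso_actual:.0f} days")  # 36 days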

At this stage I am not focusing on specific technology investments, but as you can see, any technology solution which allows you to capture strategy and KPIs and to link business processes to these artifacts will be a good way to capture this information.

In my next post on the litmus test questions, I will explore the need for understanding the specific goals behind these KPIs.

Previous relevant posts:

Litmus Test for Data Governance Initiatives: What do you need to garner executive sponsorship?

Suggested reading next:

Data Governance Litmus Test: Know goals behind KPIs

Data Governance Litmus Test: How and who is putting together metrics/KPIs for executives?

Data Governance Litmus Test: Do You Have Access to the Artifacts Used by Executives?

Read Full Post »

There were many predictions in the software industry for 2010. One of the industry’s thought leaders, Nenshad Bardoliwalla, shared his predictions in the area of “Trends in Analytics, BI and Performance Management.” His predictions about how vendors will offer packaged, strategy-driven execution applications, how slice-and-dice capabilities from BI vendors will return to their decision-centric roots, and about advanced visualization capabilities got me thinking about my favorite topic: purpose-built applications.

What is a purpose driven/built analytic application (PDAA), after all? It is an analytic application which addresses a narrowly focused business area or process and provides insight into the opportunities (for improvement) and challenges (performance). In order for such an analytic application to provide insight…

  1. It needs to be designed for a specific purpose (or problem), and that purpose or focus really needs to be narrow (to be able to provide holistic insight)
  2. It needs to rely on purpose built visualization and use Web 2.0 style technologies to make analytic insight pervasive (some examples follow)
  3. It needs to provide descriptive, prescriptive and predictive capabilities to deliver holistic insight
    1. Descriptive capabilities provide a view into the state of current affairs
    2. Prescriptive capabilities show users what to focus on as a follow-up; they also help guide users as to what questions to ask next to build holistic insight
    3. Predictive capabilities facilitate what-if analysis and provide insight into what situation the business might expect should the current trend continue
    4. Implicitly, the application tells users what questions to ask in a given situation and provides either complete answers or the data points leading up to those answers…

Many times, because of the very specific purpose and narrow focus, most of the insights provided by purpose built analytic applications can be manifested right inside the operational application via purpose built gadgets or even purpose built controls. A single dashboard with interactivity around its widgets/gadgets will typically provide complete insight into the focus/purpose of the analytic application.

Let us discuss an example of what a purpose built analytic application could be. Every organization with a sales force actively selling its products/services has a weekly call to review the pipeline. This is typically done on a region-by-region basis and the data is then rolled up at a global level. A purpose driven analytic application in this situation would be a “Weekly Pipeline Review” application. Rather than providing free-form slicing/dicing/reporting capabilities around pipeline data (which would be the traditional way), this type of application will focus on the areas below (a minimal sketch of the week-over-week logic follows the list):

  1. The current pipeline
  2. Changes to the pipeline from last week (positive and negative; this is what is watched most closely on the call to make sure forecast numbers can be achieved)
  3. The impact of those changes on achieving goals/forecast; based on these changes, extrapolate the impact on the sales organization’s plans (what-if)
  4. Visibility into deals which might be problematic based on past performance and heuristics (this is what I call prescriptive)
  5. Visibility into deals which are likely to move and close faster, again based on past performance (again prescriptive)
  6. Accounts in which incremental up-sell can be done (again based on past performance in similar accounts) but where there are no active deals/opportunities
  7. Visibility into individuals and regions which are at risk of missing their forecast based on their past and current performance.
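
To make this concrete, here is a minimal Python sketch of the week-over-week comparison such an application might perform. The deal names, amounts, and the 5% risk threshold are purely illustrative assumptions.

deals_last_week = {"Acme": 500_000, "Globex": 250_000, "Initech": 120_000}
deals_this_week = {"Acme": 450_000, "Globex": 250_000, "Hooli": 300_000}

# Deals that appeared, disappeared, or changed size since last week
added   = {d: v for d, v in deals_this_week.items() if d not in deals_last_week}
dropped = {d: v for d, v in deals_last_week.items() if d not in deals_this_week}
changed = {d: deals_this_week[d] - v
           for d, v in deals_last_week.items()
           if d in deals_this_week and deals_this_week[d] != v}

total_change = sum(deals_this_week.values()) - sum(deals_last_week.values())
# Simple heuristic: flag deals that shrank by more than 5% as at risk
at_risk = [d for d, delta in changed.items() if delta < -0.05 * deals_last_week[d]]

print("New deals:", added)
print("Dropped deals:", dropped)
print("Changed deals:", changed)
print(f"Net pipeline change: {total_change:+,}")
print("Deals flagged as at risk (shrank >5%):", at_risk)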

There are different visualizations which can be used to build this type of application. The focus of this analytic application is to help sales VPs and sales operations get through the weekly pipeline review call quickly by focusing on exceptions (both on the positive and the negative side) and to provide full insight into the impact of the changes, the areas they should focus on, and so on. Hopefully this explains in some detail the difference between a purpose built analytic application and a traditional data warehouse or traditional analytic application.

Let us now briefly look at how a purpose built UI supports some of the important aspects (holistic insight) of purpose built applications. Many of you have used the Google portal and have added iGoogle gadgets. One can look at iGoogle gadgets as purpose built applications which focus on one specific area of interest to you. Take a look at one of the samples put together by Pallav Nadhani to demonstrate FusionCharts visualizations. This gadget is a perfect example of how a purpose built UI helps create the focus and holistic insight of an analytic application: it provides a complete weather picture for a location, for today or for the future.

There is a company out of New Zealand, Sonar6, which provides a product around performance management (very focused, purpose driven) and talent management. They have done a fantastic job of building a purpose built application and delivering it through a purpose built UI. I especially like the way they have provided analytic and reporting capabilities (the helicopter view) around performance management. You can register for their demo or look at their brochure/PowerPoint presentations.

There are several other vendors who have made purpose built analytics pervasive in our day to day lives. The recommendation engine built by Amazon is a perfect example of purpose built analytics.

In the end, I truly believe that purpose built analytic applications can and will maximize the value/insight delivered to end users/customers while keeping the focus of the analytics narrow.

I would love to know your thoughts on purpose built applications. What has been your experience?

Read Full Post »


Score Card for Prioritizing Data Quality Issues

A couple of weeks back I was having a conversation with a fellow CTO; he was demonstrating an analytics product to me. There were many instances in the demo (dashboards/reports) where a lot of dimensional data was missing (for example, industry verticals, product category, etc.). Naturally, during the course of the demo, a discussion around data quality broke out. The fellow CTO mentioned that they do not encourage customers to spend time and energy fixing data quality issues, from an analytics perspective, if the numbers around those issues represent less than 1% of the overall dollar figures. I largely agreed with his argument and justification (again from a directional analytics perspective) for not fixing these data quality issues because:

1) These issues do not interfere with the analysis if the analysis hinges upon the directionality of the business.

2) The ROI from fixing these issues is not significant, as the data represented by these issues has less than a 1% impact on the directionality of the analysis (which is statistically insignificant).

This got me thinking that data quality is truly a multi-dimensional problem (like the parable of the blind men describing an elephant: each one concludes it is a different object even though everyone is touching the same elephant). As data quality professionals, it is important for all of us to bring that perspective into any data quality initiative. The best way to do this is to build a data quality scorecard with the quality assessment and its impact on the context in which the data will be used. This type of scorecard can and should be used in prioritizing the fixing of data quality issues. It will also help in justifying the ROI of addressing them.

As indicated in the chart, each context is analyzed from the perspective of the data quality attributes, and each context is given a red, yellow, or green indicator. Obviously any red indicators need to be addressed before the data can be used in that context. In this example, the scorecard demonstrates that compliance reporting requirements cannot be met until the data quality issues associated with credit ratings and address data are completely resolved. This helps with demonstrating the need and the necessary ROI, and helps in prioritizing which attributes to address first.
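
To illustrate the idea, here is a minimal Python sketch of such a context-by-attribute scorecard. The contexts, attributes, and red/yellow/green assignments are hypothetical, chosen only to mirror the example above.

scorecard = {
    "Compliance reporting":  {"credit rating": "RED",    "address": "RED",   "industry vertical": "GREEN"},
    "Directional analytics": {"credit rating": "YELLOW", "address": "GREEN", "industry vertical": "YELLOW"},
    "Billing operations":    {"credit rating": "GREEN",  "address": "RED",   "industry vertical": "GREEN"},
}

def blocking_issues(card):
    # Return, for each context, the attributes that must be fixed before the data can be used there
    return {context: [attr for attr, status in attrs.items() if status == "RED"]
            for context, attrs in card.items()}

for context, reds in blocking_issues(scorecard).items():
    print(f"{context}: {'blocked by ' + ', '.join(reds) if reds else 'usable'}")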

I would love to hear from you as to how you have prioritized and justified data quality initiatives, and what tools/techniques you used.

Read Full Post »

We have all been in situations where the applications/software we implement gets shot at because of the poor quality of the data shown to end users through those applications. In most situations it is really easy to shoot the messenger (the application/software in this case) rather than address the root cause behind the message (poor data quality). Time and time again we have seen everyone in the business shoot the messenger (because it is easy to do, but not good for the outcome it brings about for the business). In this post, I am going to talk about what to do to avoid falling into the trap of data quality issues when BI projects/initiatives are undertaken.

BI (I am using BI, or business intelligence, in a broad sense to represent all reporting, data warehousing/data marting, and analytics initiatives in the organization) is often a catalyst in bringing data and process quality issues to the forefront quickly and easily. Given the visibility and pervasive value BI brings to the organization’s informational needs, when data quality issues surface a debate suddenly ensues about whether the focus should shift solely to fixing data quality, putting the BI initiative on the back burner until the issues are fixed. This approach/organizational attitude (I have experienced this more with SMB customers) has some risks/issues associated with it (as mentioned below), and BI teams should avoid this trap by being proactive.

1. The belief that data quality can be completely fixed creates unreasonable expectations and sets the whole team (which is likely to work on fixing data quality) up for failure. Data quality issues will exist as long as data is created and used by people. Citing data quality as a reason to defer a BI implementation will deprive the organization of the informational needs which could be met by BI. Remediation of data quality is an iterative process (and not a one-shot deal), as Jim Harris discusses in his recent post “Missed It By That Much”.

2. BI initiatives will always highlight possible data quality and process issues. Making data quality one of the major contingencies for deriving value from BI initiatives is short-sighted (it is not as black and white as it is made to sound by people looking only at what is wrong because of data quality issues) and will not help organizations derive the proper ROI and edge from their BI investments.

So what do you do when data quality issues threaten to derail a BI initiative?

Be proactive: BI initiatives are driven by organizations for the insight they offer into data for making operational as well as strategic decisions. Because BI initiatives make it very easy to look at both aggregated and detail-level data across the organization, they often highlight many data quality and process related issues within the business. It is really important to highlight this side benefit of a BI implementation to all stakeholders right from the beginning and throughout the BI initiative:

1. BI initiatives will make it very easy to spot data quality issues; use artifacts from BI initiatives as your microscope to find them.
2. They will highlight process issues by showing gaps or missing relationships in the data.
3. BI initiatives can and should be used to benchmark and monitor ongoing data quality and the enforcement of business processes.

In fact, including these benefits of BI in the ROI calculation will ensure that:

1. When data quality issues are found, the organization will not be surprised; it will almost be expecting the BI initiative to highlight data/process quality issues.
2. BI initiatives will get credit (towards ROI) when these data quality issues are found and monitored.
3. Artifacts from BI (like reporting, the data warehouse, data marts, or a single view of data) will now have an official role to play in the overall data quality measurement and monitoring initiative.

What can BI teams do proactively to address data quality issues before they snowball into show stoppers for BI initiatives?

Process for Identifying Data Quality Issues before BI Implementations

1. Prioritize the functional/business areas for which BI is to be implemented; if this is already done, stick with the existing priorities.
2. Distill those priorities down into a set of business questions (10–20 for each requirement, to cover all bases) which the BI initiative will answer. These questions will have to be developed in partnership with business users.

For example: after implementing a data mart for pipeline analysis, the marketing department should be able to do competitive analysis to understand which competitor is run into most of the time, and to trend and analyze this information by all aspects of the pipeline (time, sales organizations, regions, verticals, etc.).

3. Itemize the data which will be required to answer/fulfill each business question. For example, in the above scenario the following data should be available:

a. Pipeline/opportunity records with competitor names captured
b. Opportunity data with the sales organization captured on it, etc.

4. Now that you are aware of the data groups required to support the business question, profile those data groups. In the above example, you want to look at the competitor field, profile it in light of the question, and understand (a minimal profiling sketch appears after this procedure):

a. Density: How often are the competitor and win/loss fields captured? (e.g., only 10% of the time)
b. Accuracy/validity: How many instances of the competitor field are usable vs. useless? (Of the 10% from above, 70% are usable)
c. Enrichment effort: How many instances of the competitor field need cleaning? (Of the 70% from above, almost 50% need cleanup)
d. Recency: What is the trend in entering this information? (Maybe sales started entering this data recently; if the last two quarters carry the information on 80% of transactions, then a reasonable amount of information is available for answering the question above, provided the focus is on the last two quarters)

5. Once equipped with this information, build a scorecard for the accuracy of the data and the validity of the information/insight which can be gleaned from it in the context of the question.

6. To decide what the BI team should do next, build the necessary context around the accuracy of information vs. observed data quality, to give business users a proper idea of what they can expect to get out of the initiative if it continues as is. In the above example, some of the findings which could be communicated are:

i. If all available data is looked at for analyzing competitive information, only 10% of the data (at best) carries any information. So if you do not put any constraint on the time frame, you are going to get insight into only 10% of the business at best. This will require some effort on the part of the parties involved to clean the data; otherwise this percentage will go down further.
ii. If you look only at the last two quarters and information going forward, the best visibility you can get is about 80% of your business (provided data quality/cleanliness issues are addressed; otherwise it will be less than that).

Sample of data quality score card

iii. If the business decides to go ahead with option (ii) above, an agreement/plan needs to be reached to have the marketing/sales department implement a process to enter competitive information on each deal going forward, and BI needs to commit to implementing metrics/reports/dashboards to track this as one of the indicators, providing comparisons in percentages as well as absolute numbers (a sample is shown in the data quality score card image) to show ongoing progress or the lack thereof.
iv. If option (ii) above is not acceptable to the business users, the BI initiative needs to implement metrics to track current and ongoing data quality as a first step towards continuing. This will help motivate the marketing/sales department to make the necessary investments into fixing data quality (at least going forward), and it won’t totally derail the BI initiative. After implementing the quality measurement piece, the BI team can still go ahead and implement the rest of the solution and engage all parties to start working on data quality (wherever applicable). This way, when data quality goals are achieved or better understood by the business, they can start using the BI implementation for competitive analysis with the proper data quality context in mind (for example: “I can use all of the data if I am doing competitive analysis for printers, and it will be roughly 60% complete across all time frames, but for other products I am better off looking at YTD data”).
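
To make steps 4 and 5 concrete, here is a minimal Python sketch of profiling a competitor field for density, enrichment need, and recency. The records, field names, and the cleanup heuristic are purely illustrative assumptions, not part of any real data set.

opportunities = [
    {"id": 1, "quarter": "2009-Q4", "competitor": "Vendor A"},
    {"id": 2, "quarter": "2010-Q1", "competitor": "vendor a??"},   # populated but needs cleanup
    {"id": 3, "quarter": "2010-Q1", "competitor": ""},
    {"id": 4, "quarter": "2010-Q1", "competitor": "Vendor B"},
    {"id": 5, "quarter": "2009-Q3", "competitor": None},
]

populated = [o for o in opportunities if o["competitor"]]
density = len(populated) / len(opportunities)

def needs_cleanup(value: str) -> bool:
    # crude validity heuristic: junk characters or lowercase-only entries
    return "?" in value or value == value.lower()

to_clean = [o for o in populated if needs_cleanup(o["competitor"])]
recent = [o for o in opportunities if o["quarter"] >= "2010-Q1"]
recent_density = sum(1 for o in recent if o["competitor"]) / len(recent)

print(f"Density (all time)         : {density:.0%}")
print(f"Needs enrichment           : {len(to_clean)} of {len(populated)} populated records")
print(f"Density (most recent quarter): {recent_density:.0%}")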

Doing the above exercise will help avoid surprises arising from the visibility into data quality issues during BI implementations. It will also help position BI as a critical and required partner in the enterprise data quality initiative. With some effort, planning (and a little bit of luck), you can both save the messenger and allow the business to attack the message.

Read Full Post »

During tough economic times, CFOs tighten the belt and look into managing expenses across the organization; in order to achieve the necessary cuts, they recruit help from all business units/divisions/areas within the organization. Managing expenses is an enterprise-wide task, and CFOs (chief financial officers) lead this initiative. The impact of managing expenses is far reaching; it helps the organization improve profitability by reducing expenses.

I believe that data quality is an enterprise level challenge as well; data quality affects all the parties who use, generate or maintain data for the purpose of managing their business. Typically CIOs (chief information officers) lead (or should lead) all initiatives associated with data quality, and they should look to recruit help from across the organization (just as CFOs or CEOs do). Creating enterprise wide awareness and urgency about the impact of the issues associated with data quality (and hence information quality) will ensure success and the proper level of sponsorship for data quality initiatives within the organization.

While there are several ways in which the impact of data quality can be highlighted to the organization, following are some practical examples which I have found easy to use when communicating the soft ROI or impact of data quality on individual business units.

Sales Organization:

Capturing good quality data directly helps close additional deals, or helps close deals which might otherwise have been lost. While this is a broad statement, I can give a few examples where this has helped me in realistic situations.

Capturing win/loss reasons consistently on deals the sales team marks as closed (won or lost) results in better analysis of the reasons why the organization is losing deals. Based on this information, sales and sales operations can recruit help from marketing or engineering to overcome the reasons behind those losses. Many times, once a better understanding is established of why deals are being lost against a competitor or in an industry vertical, minor tweaks in positioning or minor product enhancements can turn the tide and reduce the loss of deals in those situations. Of course, all this is possible only if good quality data about win/loss reasons is captured. In order to make sure that this data is captured on an ongoing basis, sales operations needs to commit to investing time in monitoring the capture of win/loss reasons. They also need to figure out whether there are standardized templates they might want to use to capture this information succinctly.

Another example in a similar category would be capturing competitor information (competitor name, names of the products being competed against) on all deals. Better analysis of win/loss data against competitors can help with recruiting marketing/engineering help to fend off the competition. Again, what it takes is qualitative data about the competition and products on every deal in which sales is competing.

The interesting thing about this is that one or two salvaged deals (of course, depending upon your ASP) could pay for the entire incremental expense of an enterprise data quality initiative for a year or so.

Marketing:

Among many other factors, a marketing department’s budgets and effectiveness are decided by how many sales qualified leads are generated on a periodic basis. In many instances, the speed with which marketing is able to nurture and grow leads into sales qualified leads really depends on the quality of the information captured. These days marketing automation tools provide personalized messaging, but for that messaging to work, marketing really needs to gather quality information about leads. It is relatively easy to justify the effectiveness of a marketing campaign or even a lead nurturing process by considering the data quality of the lead data. For example, a client of mine decided to market their product to clients who had an ERP system installed in-house (SAP or Oracle Applications). Unfortunately, the lead data captured through webinars and trade shows did not capture this information in a consistent format; on top of that, the field had a lot of data quality issues (free form text). Investing marketing and IT time to clean this data would greatly increase the effectiveness of any campaign or lead nurturing program run against these leads.

Operations and Finance:

If the finance or operations team is particular about capturing all the contract and customer/account details cleanly the first time around on transactions, they can save the time it takes to ship and then bill the products to the customer (time otherwise wasted figuring out where to ship and where to bill), saving several days on shipping and invoicing clients. This directly results in improved cash flow (early billing = early countdown of terms = early payment), greater customer satisfaction, etc. If the data on existing contracts or existing customers is not clean, and if your organization is in the business of reselling into those accounts or handling renewals, it is worthwhile to assess the quality of this data and save the several days it takes to go through the shipping and billing process.

In summary, I believe that data quality is an enterprise wide issue which impacts almost everyone who creates, uses, or maintains data generated through business operations. CIOs need to champion the data quality cause at the enterprise level, just the way CFOs or CEOs champion the cause of cost cutting across the enterprise, by enlisting help from each and every department. Some of the ideas given above can be used for highlighting the softer ROI of data quality across the enterprise. The details and specifics of the above examples will vary from organization to organization based on the business they are in and the business model they have within their organizations.

Read Full Post »
