Enterprise Business Intelligence (BI) projects fail for a variety of reasons: frequent scope changes against a fixed schedule constraint, unexpected and unplanned-for “must-have” requirements changes, loss of key team members onshore or offshore, chronic effort under-estimation, lack of a proper work breakdown structure, lack of QA, and so on.
Regardless of the causes, failed BI, analytics and performance management projects waste billions of dollars (and hours) each year.
Over the years, I have seen a lot of well-intentioned custom development, commercial off-the-shelf package customization – SAP, Oracle, PeopleSoft ERP, CRM, SCM – and other enterprise data-warehouse projects get into trouble. Troubled projects usually need triage, recovery and turnaround skills to straighten things out quickly.
I am afraid that BI and Corporate Performance Management is reaching a phase in its hype cycle where we are beginning to see growing demand for troubled project recovery. It doesn’t take a genius to realize that BI/Analytics project demand is growing, as it is one of the few remaining IT initiatives that can make companies more competitive. However, demand doesn’t imply project success.
Obsolete KPIs can be Lethal
In the Aesopian fable of the one-eyed stag, a deer overcomes his visual handicap by grazing on a cliff near the sea with his good eye facing the land. Since all his known dangers were on land, this kept him safe from predators for a very long time – until he was killed by a hunter in a boat.
The relevance of our KPIs can make or break our business. KPIs are often defined as static metrics for an enterprise and can easily become outdated. Economic uncertainty and competitive pressures are prompting questions on the validity of KPIs and performance management processes. To stay competitive requires a process of continually validating metrics with the business environment.
Another common challenge with KPIs is that there are too many of them. Modern technology has given us the ability to measure a very large number of parameters in the business. Some of these are more relevant than others. Jack Welch is known to have said, “Too often we measure everything and understand nothing.” Monitoring some metrics and ignoring others are decisions we make based on our business perspective.
Relevance Enabled by Process
How do you decide which KPIs are most relevant to success? An often overlooked first step is to understand the primary business goals before looking at the technology solution. Avinash Kaushik defines KPIs simply as “Measures that help you understand how you are doing against your objectives.” This fundamental approach is a good way of weeding out items which are not relevant to what we want as a business and avoiding adverse surprises. At a deeper level, building a robust Business Analytics solution requires answers to questions such as:
1. What events have the greatest impact on the business, and how are they measured?
2. How often do you validate that you are measuring the right parameters?
3. What instrumentation do you need to create the right dashboards for your KPIs? Can this instrumentation be updated as the KPIs change?
4. What is the process for collecting, synthesizing, manipulating and presenting the data to represent these metrics? How does the process change when the metrics change?
5. What technologies and architecture are necessary to support those decision-making patterns? Is a “single source of truth” needed, or is a federated model possible?
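To make Kaushik’s definition concrete, here is a minimal Python sketch of KPIs modeled as measures tied explicitly to objectives, with a staleness check to support the continual-validation process described above. All KPI names, targets and dates are hypothetical illustrations, not real metrics.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class KPI:
    """A measure tied to a business objective, per Kaushik's definition."""
    name: str
    objective: str        # the business goal this measure serves
    target: float         # what "good" looks like
    last_validated: date  # when relevance was last re-checked
    values: list = field(default_factory=list)

    def is_stale(self, today: date, max_age_days: int = 90) -> bool:
        # KPIs defined as static metrics go stale; flag any not re-validated recently
        return (today - self.last_validated).days > max_age_days

# Weed out metrics with no stated objective, and flag stale ones for review.
kpis = [
    KPI("cart_abandonment_rate", "increase online conversion", 0.25, date(2011, 6, 1)),
    KPI("page_views", "", 1_000_000, date(2010, 1, 15)),  # no objective -> candidate to drop
]

relevant = [k for k in kpis if k.objective]
stale = [k.name for k in kpis if k.is_stale(date(2012, 1, 1))]
print([k.name for k in relevant])  # ['cart_abandonment_rate']
print(stale)                       # both KPIs are past the 90-day validation window
```

The point of the sketch is that relevance and freshness are explicit, queryable properties of each metric, rather than assumptions baked into a dashboard.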
Centers of Excellence
Needless to say, this approach requires tight integration between the business owners and IT architects. A recent study by Gartner says that “IT collaboration initiatives fail because IT leaders hold mistaken assumptions about basic issues… rather than making technology the starting point, IT leaders should first identify real business problems and key performance indicators (KPIs) that link to business goals.”
Many business executives believe that IT is unable to deliver results where it counts. At the same time, IT organizations spend an incredible amount of time, money and resources simply reporting obvious data within their business process and workflows.
An organizational solution to this problem is the creation of a Competency Center or Center of Excellence (CoE) with representation from both business and IT and shared objectives. The CoE defines the blueprint for implementing BI, Performance Management and Analytics aligned with KPIs. Some of the obvious benefits include:
- Cost savings from eliminating Silos
- Better collaboration between Business and IT
- Joint ownership of corporate objectives
There are other aspects of the CoE which make it a practical approach to creating an effective vehicle for deploying analytics solutions. The sheer volume and texture of business data is much more complicated than it has ever been in modern business history. The world’s data doubles every two years, creating more opportunities for analyses. Understanding this data even at an aggregate level requires a business perspective combined with technological expertise. Furthermore, understanding technologies such as Big Data for unstructured data analysis requires business leaders and IT implementers to work together.
The CoE is the ideal structure to implement a Business Perspective Solution. A well-implemented Business Perspective Solution takes into account the key objectives of the business, leverages sophisticated analytics technologies and focuses on sustainable processes to support decision making in an organization.
Superior decisions based on business perspective separate winners from losers.
Are your KPIs in sync with your business perspectives? Please share your comments below.
1. Six Web Metrics / Key Performance Indicators To Die For by Avinash Kaushik, Occam’s Razor
2. Practical BI – What CEOs want from BI and Analytics by Ravi Kalakota, Business Analytics 3.0
3. The Stupidity of KPIs in Business Analytics by Mark Smith, Ventana Research
“More firms will adopt the Amazon EC2 model for data analytics. Put in a credit card, buy an hour’s or a month’s worth of compute and storage. Charge for what you use. No huge sign-up period or fee. Ability to fire up complex analytic systems. Can be a small or large player.” Ravi Kalakota’s forecast for 2012
Many organizations are starting to think about “analytics-as-a-service” as they struggle to cope with the problem of analyzing massive amounts of data to find patterns, extract signals from background noise and make predictions. In our discussions with CIOs and others, we are increasingly talking about leveraging private or public cloud computing to build an analytics-as-a-service model.
Analytics-as-a-Service is an umbrella term I am using to encapsulate “Data-as-a-Service” and “Hadoop-as-a-Service” strategies. It is also a catchier label.
The strategic goal is to harness data to drive insights and better decisions faster than the competition, as a core competency. Executing this goal requires developing state-of-the-art capabilities around three facets: algorithms, platform building blocks, and infrastructure.
Analytics is moving out of the IT function and into the business — marketing, research and development, strategy. As a result of this shift, the focus is more on speed-to-insight than on common or low-cost platforms. In most IT organizations it takes anywhere from six weeks to six months to procure and configure servers, then another several months to load, configure and test software. Not very fast for a business user who needs to churn data and test hypotheses. Hence the cloud-based analytics alternative is gaining traction with business users.
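The economics behind the “credit card” model above can be sketched with a back-of-the-envelope comparison. All rates and prices below are illustrative assumptions, not actual vendor pricing.

```python
# Pay-per-use cloud analytics vs. buying and operating your own servers.

def cloud_cost(hourly_rate: float, hours: float) -> float:
    """Charge only for the hours actually used -- the 'credit card' model."""
    return hourly_rate * hours

def owned_cost(purchase_price: float, monthly_ops: float, months: int) -> float:
    """Up-front hardware plus ongoing operations, paid whether used or not."""
    return purchase_price + monthly_ops * months

# A business user testing a hypothesis for ~2 weeks at 8 hours/day:
burst_hours = 14 * 8
print(cloud_cost(2.50, burst_hours))  # 280.0
print(owned_cost(10_000, 500, 3))     # 11500 -- plus a six-week-to-six-month wait
```

For bursty, exploratory workloads the pay-per-use model wins on both cost and, crucially for speed-to-insight, elapsed time; for steady 24/7 workloads the comparison can flip.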
The change in consumer behavior and expectations that e-commerce, mobile and social media are causing is hugely significant – big data and predictive analytics will separate brand/retail winners from losers. This won’t happen overnight but the transformation is for real.
The retail industry makes up a sizable part of the world economy (6-7%) and covers a large ecosystem - E-commerce, Apparel, Department Stores, Discount Drugstores, Discount Retailers, Electronics, Home Improvement, Specialty Grocery, Specialty Retailers and Consumer Product Goods suppliers.
Retail increasingly looks like a barbell – a brand-oriented cluster at the high end, a very thin middle, and a price-sensitive cluster at the low end. The consumerization of technology is putting more downward pricing pressure on an already competitive “middle” retail environment. The squeeze is coming from e-commerce and new “point, scan and analyze” technologies that give shoppers powerful decision-making tools — pricing, promotion and product information, often in real time. Apps on iPhone and Android devices, like RedLaser, can scan barcodes and provide immediate price, product and cross-retailer comparisons. They can even point you to the nearest retailer who can give you free shipping (total-cost-of-purchase optimization). This will lead to further margin erosion for retailers that compete on price (a sizable chunk of the market in the U.S., Europe and Asia).
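The total-cost-of-purchase comparison these scanning apps perform can be sketched in a few lines. The retailer names, prices, shipping fees and tax rate below are made-up illustrations, not real data.

```python
# Compare the true out-the-door cost of the same item across retailers.

def total_cost(price: float, shipping: float, tax_rate: float = 0.0) -> float:
    return round(price * (1 + tax_rate) + shipping, 2)

offers = {
    "RetailerA (in-store)": total_cost(49.99, 0.00, tax_rate=0.08),
    "RetailerB (online)": total_cost(44.99, 5.99),
    "RetailerC (online, free shipping)": total_cost(46.50, 0.00),
}

best = min(offers, key=offers.get)
print(best, offers[best])  # RetailerC wins once shipping and tax are included
```

Note that the cheapest sticker price (RetailerB) does not win: shipping and tax change the ranking, which is exactly the margin pressure described above.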
Data analytics is not new for retailers. Point-of-sale transactional data obtained from barcodes first appeared in the 1970s. A pack of Wrigley’s chewing gum was the first item scanned using a Universal Product Code (UPC), in a Marsh Supermarket in Troy, Ohio in 1974. Since then, retailers have been applying analytics to get even smarter.
More recent use cases of retail analytics include:
Who doesn’t want to achieve faster “time-to-information” and shorter “time-to-decision” for executives and managers with mobile BI? Who doesn’t want to disseminate insights or KPIs to front-line employees, such as field sales representatives, line of business managers, and field service employees?
The question is not whether Mobile BI is a good idea but how to execute this program in a low-cost way? How to design and deploy eye-popping “wow” apps? How to support, maintain and enhance these apps which are constantly changing? What technology and infrastructure to put in for a national or global deployment? Who is going to fund all this plumbing – corporate, LoB or IT?
Business Analytics solutions for “always-on” 3G/4G-enabled mobile devices – iPads, iPhones, tablets, smartphones – are becoming prevalent as the form factor becomes appropriate for BI. We are increasingly seeing firms build state-of-the-art dashboard solutions for iPads. These “post-desktop” apps provide senior management with intuitive, interactive access to the company’s most important business KPIs and help them deal with data overload.
Tablets, 4G wireless and next-gen displays (plus gesture-based and verbal interfaces) have enabled new productivity improvements and better ways to consume information, perform ad-hoc querying and do scenario planning. Dashboards, heatmaps and scorecards on iPads, iPhones and Androids are intuitive, attractive, powerful, and available at any time and any place: a perfect mix for top managers, sales teams and even customers.
BI (and Information Management) is a natural fit for mobile devices. Managers and blue- and white-collar workers spend a majority of their time away from their desks. Most are traveling, walking about or driving from site to site. And it’s these mobile workers who need the most up-to-date information. They need mobile BI to retrieve data to make on-the-spot decisions, monitor operational processes, and review KPI and work-in-progress dashboards.
Data overload is becoming a huge challenge for businesses and a headache for decision makers. Public and private sector corporations are drowning in data — from sales, transactions, pricing, supply chains, discounts, products, customers, processes, projects, RFID smart tags and tracking of shipments, as well as e-mail, Web traffic and social media.
I see this data problem getting worse. Enterprise software, Web and mobile technologies are more than doubling the quantity of business data every year, and the pace is quickening. But the data/information tsunami is also an enormous opportunity if and only if tamed by the right organization structure, processes, people and platforms.
A BI CoE (also called BI Shared Services or BI Competency Centers) is all about enabling this disciplined transformation along the information value chain: “Raw Data -> Aggregated Data -> Intelligence -> Insights -> Decisions -> Operational Impact -> Financial Outcomes -> Value creation.” A BI CoE can improve operating efficiencies by eliminating duplication and streamlining processes.
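The value chain above can be rendered as a toy pipeline of composable stages. The stage logic below is hypothetical; the point is the disciplined hand-off from raw data through aggregation and insight to a decision.

```python
# Toy "Raw Data -> Aggregated Data -> Insight -> Decision" pipeline.

raw = [("2012-01-03", "east", 120.0), ("2012-01-03", "west", 80.0),
       ("2012-01-04", "east", 150.0), ("2012-01-04", "west", 60.0)]

def aggregate(rows):                # Raw Data -> Aggregated Data
    totals = {}
    for _, region, amount in rows:
        totals[region] = totals.get(region, 0.0) + amount
    return totals

def derive_insight(totals):         # Aggregated Data -> Intelligence/Insight
    leader = max(totals, key=totals.get)
    return f"{leader} region leads with {totals[leader]:.0f} in sales"

def decide(insight):                # Insight -> Decision
    return f"Decision: shift promotion budget based on '{insight}'"

print(decide(derive_insight(aggregate(raw))))
```

A CoE’s job is to own and standardize each of these hand-offs at enterprise scale, so the same raw data reliably produces the same decisions regardless of which silo runs the query.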
In this posting we are going to look at several aspects of executing a BI CoE:
- What does a BI CoE need to do?
- Insourcing or outsourcing the BI CoE
- Why do BI CoEs fail?
- BI CoE Implementation Checklist
Competency Centers, Centers of Excellence (CoE) or Shared Services models are structures to enable the corporate or strategic vision of an enterprise that uses data and analytics for business value.
BI CoE is an organizing mechanism to align People, Process, Technology and Culture. The target benefits include:
- Better collaboration between Business and IT
- Increased adoption and use of BI and Analytics in the lines of business.
- Better data management, quality and reporting
- Cost savings from eliminating redundant functions
The goal for every world-class BI CoE: enable the right combination of toolset, dataset, skillset and mindset for better, faster, cheaper and more repeatable analytics.
The “Raw Data -> Aggregated Data -> Intelligence -> Insights -> Decisions” progression is a differentiating causal chain in business today. To service this “data -> decision” chain, a very large industry is emerging.
Business Intelligence, Performance Management and Data Analytics is a large, confusing software category with multiple sub-categories — mega-vendors (full stack), niche vendors, data discovery, visualization, data appliances, open source, cloud/SaaS, data integration, data quality, mobile BI, services and custom analytics.
But the interest in BI and analytics is surging. Arnab Gupta, CEO of Opera Solutions, explains why analytics is taking center stage: “We live in a world where computers, not people, are in the driver’s seat. In banking, virtually 100% of the credit decisions are made by machines. In marketing, advanced algorithms determine messages, sales channels, and products for each consumer. Online, more and more volume is spurred by sophisticated recommender engines. At Amazon.com, 40% of business comes from its ‘other people like you bought…’ program.” (Businessweek, September 29, 2009)
Here is a list of vendors who participate in this marketspace:
However, it took until the 1980s for decision support systems (DSS) to become popular, and until the mid-1990s for BI to emerge as an umbrella term covering software-enabled innovations in performance management, planning, reporting, querying, analytics, online analytical processing, integration with operational systems, predictive analytics and related areas.
BI, Analytics [and Big Data] Market Sizing
“There are many methods for predicting the future. For example, you can read horoscopes, tea leaves, tarot cards, or crystal balls. Collectively, these methods are known as ‘nutty methods.’ Or you can put well-researched facts into sophisticated computer models, more commonly referred to as ‘a complete waste of time.’”
Scott Adams, The Dilbert Future
Are you clear on your objective? What is the most important value proposition that you want to achieve through BI and analytics enabled strategies?
- Reduction in operating expenses
- Increased profitability
- Improved growth, competitiveness and market position
- Customer acquisition, loyalty and retention
- Product development and differentiation
The misalignment between what the C-suite wants and what IT is capable of delivering is quite extraordinary. Many CFOs and CEOs believe that IT is unable to deliver results where it counts: the top line and the bottom line. At the same time, IT organizations spend an incredible amount of time, money and resources simply reporting the obvious data within their business processes and workflows. Data overload is making finding the obvious in the increasing tidal wave of structured and unstructured data a full-time job. As organizations emerge from the deep recession of 2008, competitive pressures are putting even greater demands on the decision-making, KPIs and performance management processes of organizations.
To stay competitive means making better decisions more quickly. It means accelerating the “raw data -> clean data -> information -> insight -> decision cycle.” It dictates widening the scope and scale of the data management domain, the analytic landscape and the technological infrastructure.