“More firms will adopt the Amazon EC2 model for data analytics. Put in a credit card, buy an hour’s or a month’s worth of compute and storage. Charge for what you use. No huge sign-up period or fee. Ability to fire up complex analytic systems. Can be a small or large player.” — Ravi Kalakota’s forecast for 2012
Many organizations are starting to think about “analytics-as-a-service” as they struggle to analyze massive amounts of data to find patterns, extract signals from background noise and make predictions. In our discussions with CIOs and others, we increasingly talk about leveraging private or public cloud computing to build an analytics-as-a-service model.
Analytics-as-a-Service is an umbrella term I am using to encapsulate “Data-as-a-Service” and “Hadoop-as-a-Service” strategies. It is also a catchier label.
The strategic goal is to harness data to drive insights and better decisions faster than the competition, as a core competency. Executing this goal requires developing state-of-the-art capabilities around three facets: algorithms, platform building blocks, and infrastructure.
Analytics is moving out of the IT function and into the business — marketing, research and development, and strategy. As a result of this shift, the focus is more on speed-to-insight than on common or low-cost platforms. In most IT organizations it takes anywhere from 6 weeks to 6 months to procure and configure servers, then another several months to load, configure and test software. Not very fast for a business user who needs to churn data and test hypotheses. Hence the cloud-based analytics alternative is gaining traction with business users.
Data overload is becoming a huge challenge for businesses and a headache for decision makers. Public and private sector organizations are drowning in data — from sales, transactions, pricing, supply chains, discounts, products, customer processes, projects, RFID smart tags and shipment tracking, as well as e-mail, Web traffic and social media.
I see this data problem getting worse. Enterprise software, Web and mobile technologies are more than doubling the quantity of business data every year, and the pace is quickening. But the data/information tsunami is also an enormous opportunity — if, and only if, it is tamed by the right organizational structure, processes, people and platforms.
A BI CoE (also called BI Shared Services or a BI Competency Center) is all about enabling this disciplined transformation along the information value chain: “Raw Data -> Aggregated Data -> Intelligence -> Insights -> Decisions -> Operational Impact -> Financial Outcomes -> Value Creation.” A BI CoE can also improve operating efficiency by eliminating duplication and streamlining processes.
In this posting we are going to look at several aspects of executing a BI CoE:

- What does a BI CoE need to do?
- Insourcing or outsourcing the BI CoE
- Why do BI CoEs fail?
- A BI CoE implementation checklist