Health expenditures in the United States neared $3.0 trillion in 2013, more than ten times the $256 billion spent in 1980. Almost 15% of U.S. GDP is estimated to be spent on health care.
Approximately 149 million people in the U.S. rely on employer-funded health coverage, according to the Kaiser Family Foundation. In 2012, the average annual cost of health coverage per employee was $10,558, up from $4,924 in 2001, more than doubling in 11 years (source: Mercer).
The focus in health care today is on reducing waste in the system through new forms of health IT that cut inefficiencies, redundancies and administrative costs. According to the CEO of Aetna, “the health care system wastes more than $765 billion each year – that’s 30 percent of our health care spending.”
As a mega-vertical, healthcare covers several major segments (the 7 Ps):
- Payers (Health Insurance and Health Plans)
- Providers (Hospital Systems, Labs and IDNs)
- Pharmacy (retail distribution networks)
- Pharmaceutical and medical equipment manufacturers
- Prescribers (Physicians and clinics)
- Police (regulators)
While spending on health care dominates headlines, the health care industry (the 7 Ps) is in a state of flux. Stakeholders across the sector are running hard to reduce costs. The drivers impacting the cost of health care include:
- Aging population – 100% of the population is aging
- Rise in chronic disease – accounts for an estimated 75% of cost
- Continuing demand for new technology
- Drug costs – improving as generics arrive, but worsening again as biologic drugs replace them
- Waste – estimated at 30%, though the figure depends on the definition
The healthcare ecosystem is being reshaped by two powerful countervailing economic forces: (1) improve the quality of care and (2) drive the cost of care down. Basically, spend less and get more. As a result, the entire healthcare ecosystem is shifting to an “information-driven”, “evidence-based” and “outcome-driven” model.
The target healthcare transformation goals are:
- Align economic incentives between payers and providers
- Digital engagement – create a simpler, more transparent consumer experience
- Connected health – technologies that seamlessly connect our healthcare system
In this posting we look at health care use cases and how data and analytics are slowly but surely being adopted in the form of informatics. All of this change is being driven under the banner of health reform.
The “real meat and potatoes” use cases behind actual big data adoption might be B2B machine data management and industrial analytics enabled by wireless, battery-free sensor platforms.
While social, consumer, retail and mobile big data get a lot of PR, the big data business cases around industrial machine data analytics or “things that spin” actually make economic sense. These projects tend to show tangible Return on Investment (ROI).
The concept of Internet-connected machines that collect telemetry data and communicate, often called the “Internet of Things” or M2M, has been marketed for several years:
- I.B.M. has its “Smarter Planet” initiative
- Cisco has its “Internet of Everything” initiative
- GE has its “Industrial Internet” initiative.
- Salesforce.com has its “Internet of Customers” theme
To compete with GE, Hitachi, United Technologies, Siemens, Philips and other industrial giants are all jumping on the bandwagon, as the M2M vision is now viable thanks to advances in microelectronics, wireless communications, and microfabricated (MEMS) sensing platforms of rapidly diminishing size.
The Industrial Internet – making smart use of sensors, networked machines and data analytics – is the big vision, but the business driver is eliminating unplanned downtime for customers.
As a data engineer and scientist, I have been following the NSA PRISM raw intelligence mining program with great interest. Its engineering complexity, breadth and scale are simply amazing compared to, say, credit card analytics (Fair Isaac) or marketing analytics firms like Acxiom.
Some background… PRISM - “Planning Tool for Resource Integration, Synchronization, and Management” - is a top-secret data-mining “connect-the-dots” program aimed at terrorism detection and other pattern extraction authorized by federal judges working under the Foreign Intelligence Surveillance Act (FISA). PRISM allows the U.S. intelligence community to look for patterns across multiple gateways across a wide range of digital data sources.
PRISM is an unstructured big data aggregation framework – audio and video chats, phone call records, photographs, e-mails, documents, financial transactions and transfers, internet searches, Facebook posts, smartphone logs and connection logs – plus the analytics that enable analysts to extract patterns. Save and analyze all of the digital breadcrumbs people don’t even know they are creating.
The whole NSA program raises an interesting debate about “Sed quis custodiet ipsos custodes?” (“But who will watch the watchers?”) Read more
At the Analytics Executive Forum, I facilitated a session on omni-channel analytics. It struck me how every leading consumer-facing firm seems convinced that mobile is becoming the dominant B2C interaction channel. Mobile is the gateway to insight-based marketing and the “always addressable customer.”
Insight-based interactions – The company knows who you are, what you prefer, and communicates with relevant, timely messages, using the power of analytical intelligence to detect patterns, decode strands of information and create meaningful offers and value.
The “always addressable customer” is a consumer who fits the bill on three fronts simultaneously:
- Owns and personally uses at least three connected devices
- Goes online multiple times throughout the day
- Goes online from at least three different physical locations
The opposite of insight-based is “spray-and-pray” marketing – the company has very limited knowledge about who you are, forgets what you prefer, and tries to reach you with off-target communications that alienate you. Built on fragmented data, poor data quality and inadequate integration, the result is confusing, chaotic interactions. A good example: “I have 2 million frequent flyer miles with your airline and still do not get any recognition, respect or value from this loyalty.”
As companies architect new insight-based mobile use cases, I suggest they look at what is coming next. With iOS 7, Apple is delivering several new features, including Passbook and iBeacon.
Retailers, banks and other customer-facing firms and brands had better pay attention: 100+ million iPhones will automatically get these features with the OS upgrade, making this a mega-disruptor in the coveted target segment everyone is chasing. Read more
Machine data or “data exhaust” analysis is one of the fastest growing segments of “big data” – generated by websites, applications, servers, networks, mobile devices and other sources. The goal is to aggregate, parse and visualize this data – log files, scripts, messages, alerts, changes, IT configurations, tickets, user profiles, etc. – to spot trends and act.
By monitoring and analyzing data from customer clickstreams, transactions, log files, network activity, call records and more, a new breed of startups is racing to convert “invisible” machine data into useful performance insights. The label for this type of analytics: operational or application performance intelligence.
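The aggregate-parse-act loop above can be sketched in a few lines: parse raw log lines into structured records, then count patterns. The log lines and field names below are made up for illustration; real machine data varies widely by source.

```python
import re
from collections import Counter

# Hypothetical web-server log lines in a common-log-style format.
LOG_LINES = [
    '10.0.0.1 - - [12/Mar/2013:10:15:32 +0000] "GET /checkout HTTP/1.1" 500 1234',
    '10.0.0.2 - - [12/Mar/2013:10:15:33 +0000] "GET /home HTTP/1.1" 200 532',
    '10.0.0.1 - - [12/Mar/2013:10:15:35 +0000] "POST /checkout HTTP/1.1" 500 881',
]

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+)'
)

def parse(lines):
    """Turn raw log text into structured records (dicts of named fields)."""
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m:
            yield m.groupdict()

records = list(parse(LOG_LINES))

# Spot a trend: which endpoint is generating the most server (5xx) errors?
errors_by_path = Counter(r["path"] for r in records if r["status"].startswith("5"))
print(errors_by_path.most_common(1))  # → [('/checkout', 2)]
```

The same pattern (parse, then aggregate) scales up to the clickstreams, tickets and alerts mentioned above; tools like Splunk industrialize exactly this loop.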
In this posting we cover a low-profile big data company, Splunk. Splunk already has more than 3,500 customers, and it ended its first day on the stock market with an amazing 108.7 percent bump from its $17-per-share IPO price. Splunk’s potential comes from its presence in the growing cloud-analytics space: companies gathering incredible amounts of data need help making sense of it and using it to optimize business efficiency, and Splunk’s services give users the opportunity to get more from the information they gather.
The hard truth is, most advertising and marketing is white noise. Consumers have learned to tune it out. The “same old, same old” is just that — the same, and old. For brands and campaigns to be effective, they must change the conversation.
Data-driven DNA is about having the right toolset, mindset, skillset and dataset to evolve a major brand and seize today’s omni-channel opportunities. Whether it’s retooling and retraining for the multiscreen attention economy, or introducing digital innovations that transform both retail and healthcare, P&G is bringing data into every part of its core strategies to fight for the customer.
Striving for market leadership in consumer products is a non-stop managerial quest. In the struggle for survival, the fittest win out at the expense of their rivals because they succeed in adapting themselves best to their environment.
CMOs and CIOs everywhere agree that analytics is essential to sales & marketing and that its primary purpose is to gain access to customer insight and intelligence along the market funnel – awareness, consideration, preference, purchase and loyalty.
In this posting we illustrate a best-in-class “run-the-business” Analytics Case Study at P&G. The case study demonstrates four key characteristics:
- A shared belief that data is a core asset that can be used to enhance operations, customer service, marketing and strategy
- More effective leverage of more data for faster results
- A recognition that technology is only a tool, not the answer
- Support for analytics by senior managers who embrace new ideas and are willing to shift power and resources to those who make data-driven decisions
This case study of a novel construct called Business Cockpit (also called LaunchTower in the Pharma Industry) illustrates the way Business Analytics is becoming more central in retail and CPG decision making.
Here is a quick summary of P&G Analytics program:
- Primary focus on improving management decisions at scale
- “Information and Decision Solutions” (IT) embeds over 300 analysts in leadership teams
- Over 50 “Business Suites” for executive information viewing and decision-making
- “Decision cockpits” on 50K desktops
- 35% of marketing budget on digital
- Real-time social media sentiment analysis for “Consumer Pulse”
P&G Overview
“Data modeling, simulation, and other digital tools are reshaping how we innovate.” – Bob McDonald, ex-CEO, Procter & Gamble. Digital strategies tend to play a vital role in defining the brand and connecting it with customers across the globe. P&G has 127,000 employees and 300 brands sold in 180 countries, and averages about 4 billion transactions daily. P&G’s CEO staked out a strategy to “digitize” the company’s processes from end to end, and Business Sufficiency, Business Sphere and Decision Cockpits are enablers of that agenda.
Data is an asset; treat it as such. P&G is building deeper analytics expertise at a time when it is cutting costs in other areas, including eliminating 1,600 non-manufacturing jobs. The company’s IT organization itself has cut $900 million in total spending over the past nine years, continually evolving its structure and culture to meet harder and harder targets.
P&G is investing in analytics talent, even as the company cuts in other areas, to speed up business decision making. True leaders develop the capabilities required for making good and timely decisions in unpredictable and stressful environments.
Can you predict what customers want before they do? Can you formulate the “next best action”?
Can offers be better targeted or timed? How to improve customer acquisition and conversion?
Growing the customer relationship is the perpetual challenge of all companies. To change the status quo, eBay bought Hunch to help improve its recommendation services. eBay said it will use Hunch’s “taste graph” technology to provide its users with non-obvious recommendations for items based on their unique tastes.
eBay, with its $2.4 billion GSI Commerce acquisition, is becoming an alternative e-commerce platform for retailers (rivaling Amazon.com). eBay aims to apply Hunch’s technology to other areas such as search, advertising and marketing, in order to better surface product information based on its customers’ tastes.
Data Driven Retailing
Recommendation and decision engines, an area of predictive analytics and decision management, are quite active right now in the digital arena. The early online pioneer was Amazon.com, which used collaborative filtering to generate “you might also want” or “next best offer” prompts for each product bought or page visited.
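The co-purchase idea behind “people who bought this also bought that” can be sketched simply. The baskets below are made up for illustration; production systems like Amazon’s compute these correlations at vastly larger scale.

```python
from collections import defaultdict, Counter

# Hypothetical order baskets; each list is one customer's purchase.
baskets = [
    ["pizza", "ice cream"],
    ["pizza", "ice cream", "breadsticks"],
    ["pizza", "breadsticks"],
    ["ice cream"],
]

# Count how often each pair of items appears in the same basket.
co_counts = defaultdict(Counter)
for basket in baskets:
    for item in basket:
        for other in basket:
            if other != item:
                co_counts[item][other] += 1

def also_bought(item, n=2):
    """Items most frequently co-purchased with `item`."""
    return [other for other, _ in co_counts[item].most_common(n)]

print(also_bought("pizza"))  # ice cream and breadsticks, by co-purchase count
```

Note this uses only cross-purchase correlations, not customer or product attributes, which is exactly why the early Amazon offers required no deep customer knowledge.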
A typical targeted offer analytics model is shown in the figure (source: blog.strands.com). Next best action, next best offer, interaction optimization, and experience optimization all share similar structure.
The premise of data driven retailing is simple:
- Acquire the right customers
- Offer the right products
- Personalize relevant offers
- Focus on the Right timing & Channels
To understand the impact that recommendation engines can have on sales, let’s look at a traditional brick-and-mortar firm doing direct to home face-to-face selling…Schwan Food.
Schwan Food – The Business Problem:
The Schwan Food Company is a multibillion-dollar, privately owned company with 17,000 employees in the United States. Based in Marshall, Minnesota, Schwan sells frozen foods from home-delivery trucks, in grocery-store freezers, by mail and to the food service industry. Schwan produces, markets, and distributes products developed under brands such as Schwan’s, Red Baron, Freschetta, Tony’s, Mrs. Smith’s, Edwards, Pagoda Express and many others.
Schwan’s Home Service, the company’s flagship business unit, is the largest direct-to-home food delivery provider in the United States. Sales are done door-to-door by 6,000 roving sales people who deliver frozen products to homes of three million customers across the country.
Schwan’s home sales were listless for four straight years, beset by high customer churn and inventory pileups. So the challenge was: how to spark sales? How to get an uplift of 3-4%?
At the point of customer contact…Schwan wanted to personalize the experience. The goal is to dig deep into customer data, generate insights and engage customers in innovative ways.
What are the primary drivers of sales? Schwan realized that recommending products that fit a customer’s profile, purchase history and interests creates higher revenue potential for cross-sell and up-sell.
The challenge was to overhaul the crude recommendation program that existed. Firms like Schwan typically provide the sales team with data from the SAP back-end, and most of this data is stale rather than dynamic. For instance, salespeople could look at six weeks of orders and suggest purchases from that list.
For an interesting background on recommendation engines see: http://en.wikipedia.org/wiki/Recommender_system
To completely overhaul the recommendation engine, Schwan began an analytics project with the aid of Opera Solutions Inc. of New York, an eight-year-old analytics firm.
The analytics project took it into more sophisticated territory: matching seemingly disparate customers with similar purchase patterns in their past – Opera calls this finding “genetic twins.” It also added ways to track whether customers’ spending was fading from certain categories – say, breakfast foods – and offered product suggestions and discounts to keep the spending intact.
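Opera’s actual method is proprietary, but the “genetic twin” idea – matching customers by the similarity of their purchase histories – can be sketched with cosine similarity over spend-by-category vectors. The customers and numbers below are hypothetical.

```python
import math

# Hypothetical spend-by-category vectors per customer.
customers = {
    "A": {"breakfast": 40, "pizza": 10, "dessert": 0},
    "B": {"breakfast": 35, "pizza": 12, "dessert": 5},
    "C": {"breakfast": 0,  "pizza": 50, "dessert": 45},
}

def cosine(u, v):
    """Cosine similarity between two sparse spend vectors (1.0 = identical mix)."""
    cats = set(u) | set(v)
    dot = sum(u.get(c, 0) * v.get(c, 0) for c in cats)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def genetic_twin(name):
    """The most similar other customer; the twin's purchases can seed recommendations."""
    others = [c for c in customers if c != name]
    return max(others, key=lambda c: cosine(customers[name], customers[c]))

print(genetic_twin("A"))  # → "B" (similar breakfast-heavy spending mix)
```

Once a twin is found, items the twin buys that the customer does not yet buy become candidate recommendations; a fading category (the twin keeps buying breakfast foods, the customer has stopped) becomes a candidate for a discount.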
How does this work? At the core of a recommendation engine is predictive modeling, which identifies and mathematically represents underlying relationships in historical data in order to explain the data and make predictions or classifications about future events.
Predictive models analyze current and historical data on individuals to produce easily understood metrics such as scores. These scores rank-order individuals by likely future behavior, e.g., their likelihood of responding to a particular offer.
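A minimal sketch of such a score: a logistic function over customer features, with hand-set illustrative weights (a real model learns its weights from historical response data, and the feature names here are made up).

```python
import math

# Illustrative weights; a fitted model would learn these from history.
WEIGHTS = {"recency_weeks": -0.4, "orders_last_quarter": 0.3, "opened_last_email": 1.2}
BIAS = -1.0

def score(features):
    """Probability-like score of responding to a particular offer."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic function, output in (0, 1)

prospects = {
    "cust_1": {"recency_weeks": 1, "orders_last_quarter": 5, "opened_last_email": 1},
    "cust_2": {"recency_weeks": 8, "orders_last_quarter": 1, "opened_last_email": 0},
}

# Rank-order individuals by likely future behavior, highest score first.
ranked = sorted(prospects, key=lambda c: score(prospects[c]), reverse=True)
print(ranked)  # → ['cust_1', 'cust_2']
```

The scores themselves matter less than the rank ordering: the top of the list gets the offer, the handheld alert, or the sales call first.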
Schwan’s database is now pushing out more than 1.2 million dynamically-generated customer recommendations every day, sent directly to drivers’ handheld devices.
Opera says Schwan’s revenues are up 3% to 4% because of it.
It would be interesting to see the correlation between Schwan’s customer satisfaction scores and shopping basket mix with recommendations versus non-recommendations.
E-mail Based Recommendations
In multichannel customer-facing business processes, marketers must continually and automatically optimize all offers and customer interactions across all channels, business processes, and touchpoints such as sales, marketing, and customer service. E-mail-based recommendation models are pretty advanced.
The same push based recommendation model can be leveraged via e-mail (in addition to mobile handheld direct sales). Williams-Sonoma, all things kitchen and cooking, has a database of 60M households tracking variables like income, number of children, housing values, etc. They leverage these variables in e-mail targeting programs.
Offers embedded in e-mail are tailored to the recipient at the moment they’re opened. In less than 250 milliseconds, analytics software can assemble an offer based on real-time information: data including location, age, gender, and online activity both historical and immediately preceding, along with inventory data. These offers have lifted conversion rates by as much as 30%—dramatically more than similar but uncustomized ad campaigns.
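A toy sketch of assembling an offer at open time from eligibility rules, real-time profile data and inventory. The offers, rules and field names below are hypothetical; a production system would score offers with a model and respond within the latency budget described above.

```python
# Hypothetical offer catalog with simple eligibility rules.
OFFERS = [
    {"id": "knife-set", "min_income": 80_000, "in_stock": True, "margin": 0.30},
    {"id": "baking-sheet", "min_income": 0, "in_stock": True, "margin": 0.15},
    {"id": "espresso-machine", "min_income": 60_000, "in_stock": False, "margin": 0.40},
]

def assemble_offer(profile):
    """Pick the best offer for this recipient at open time.

    Filters by inventory and eligibility, then picks the highest-margin
    survivor; a real system would rank by a predicted-response score.
    """
    eligible = [
        o for o in OFFERS
        if o["in_stock"] and profile["income"] >= o["min_income"]
    ]
    return max(eligible, key=lambda o: o["margin"])["id"] if eligible else None

print(assemble_offer({"income": 90_000}))  # → knife-set
print(assemble_offer({"income": 30_000}))  # → baking-sheet (only eligible offer)
```

The key design point is that the decision is deferred to open time, so current inventory and the recipient's latest activity can flow into the choice.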
“Next Best Offer” – Online Recommendation Examples
The Netflix movie recommendation contest (a blend of different statistical and machine-learning techniques) has been widely followed because its crowdsourcing lessons could extend beyond improving movie picks. The outcome: the Cinematch recommendation solution, built around a huge data set – 100+ million movie ratings – and the challenges of large-scale predictive modeling.
Netflix’s overview of the competition:
We’re quite curious, really. To the tune of one million dollars.
Netflix is all about connecting people to the movies they love. To help customers find those movies, we’ve developed our world-class movie recommendation system: CinematchSM. Its job is to predict whether someone will enjoy a movie based on how much they liked or disliked other movies. We use those predictions to make personal movie recommendations based on each customer’s unique tastes. And while Cinematch is doing pretty well, it can always be made better.
Now there are a lot of interesting alternative approaches to how Cinematch works that we haven’t tried. Some are described in the literature, some aren’t. We’re curious whether any of these can beat Cinematch by making better predictions. Because, frankly, if there is a much better approach it could make a big difference to our customers and our business.
So, we thought we’d make a contest out of finding the answer. It’s “easy” really. We provide you with a lot of anonymous rating data, and a prediction accuracy bar that is 10% better than what Cinematch can do on the same training data set. (Accuracy is a measurement of how closely predicted ratings of movies match subsequent actual ratings.) If you develop a system that we judge most beats that bar on the qualifying test set we provide, you get serious money and the bragging rights. But (and you knew there would be a catch, right?) only if you share your method with us and describe to the world how you did it and why it works.
Serious money demands a serious bar. We suspect the 10% improvement is pretty tough, but we also think there is a good chance it can be achieved. It may take months; it might take years. So to keep things interesting, in addition to the Grand Prize, we’re also offering a $50,000 Progress Prize each year the contest runs. It goes to the team whose system we judge shows the most improvement over the previous year’s best accuracy bar on the same qualifying test set. No improvement, no prize. And like the Grand Prize, to win you’ll need to share your method with us and describe it for the world.
Netflix announcement of winner:
It is our great honor to announce the $1M Grand Prize winner of the Netflix Prize contest as team BellKor’s Pragmatic Chaos for their verified submission on July 26, 2009 at 18:18:28 UTC, achieving the winning RMSE of 0.8567 on the test subset. This represents a 10.06% improvement over Cinematch’s score on the test subset at the start of the contest.
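The contest metric, RMSE (root mean squared error), measures how closely predicted ratings match the actual ratings that members later give. A minimal sketch, with made-up ratings:

```python
import math

def rmse(predicted, actual):
    """Root mean squared error between predicted and actual ratings."""
    assert len(predicted) == len(actual)
    return math.sqrt(
        sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)
    )

# Made-up 1-5 star ratings for illustration; the Netflix Prize applied
# the same formula to a held-out test set of real member ratings.
actual    = [4, 3, 5, 2, 4]
predicted = [3.8, 3.4, 4.5, 2.6, 3.9]
print(round(rmse(predicted, actual), 3))  # → 0.405
```

Lower is better: a 10% reduction in RMSE was the Grand Prize bar, and 0.8567 was the winning value on Netflix's test subset.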
Interestingly, several people find the “what your friends thought” feature extremely accurate in predicting and suggesting movies – more so than the recommendation feature.
Netflix announced a second recommendation contest, later discontinued. Contestants were asked to model individuals’ “taste profiles,” leveraging demographic and behavioral data. The data set of 100 million entries was to include information about renters’ ages, gender, ZIP codes, genre ratings and previously chosen movies. Unlike the first challenge, the contest had no specific accuracy target: $500,000 would go to the team in the lead after six months, and $500,000 to the leader after 18 months. The contest was cancelled in May 2010 after a legal challenge alleging that the first contest had breached customer privacy.
Building on Netflix model, California physicians group Heritage Provider Network Inc. is offering $3 million to any person or firm who develops the best model to predict how many days a patient is likely to spend in the hospital in a year’s time. Contestants will receive “anonymized” insurance-claims data to create their models. The goal is to reduce the number of hospital visits, by identifying patients who could benefit from services such as home nurse visits.
I expect to see a lot more activity around predictive recommendations as mobile technology makes it easier to influence buyers and convert prospects into customers. Technologies like Hadoop also make it easier to build predictive insights that can be leveraged in real time.
Targeting customers with perfectly customized recommendations at the right moment across the right channel is sales and marketing’s holy grail. As the ability to capture and analyze highly granular data improves, such recommendations are possible.
Perfecting these “next best product recommendation” models involves four steps: defining sales and marketing objectives; gathering detailed primary or secondary data about your customers, your products, and the contextual prompts that influence customers to buy; using data analytics and business rules to devise offers; and executing those offers across channels.
As the amount of data that can be captured grows and the number of channels for interaction proliferates, companies that are not providing recommendations to influence buyers will only fall further behind.
Notes (and Interesting Factoids)
- A recommendation engine generates tailored, context-sensitive recommendations to guide decisions and actions taken by humans, automated systems, or a combination thereof. For background on recommendation engines: http://en.wikipedia.org/wiki/Recommender_system
- In the late 1990s, predictive recommendations were created by Amazon and other online companies that developed “people who bought this also bought that” offers based on relatively simple cross-purchase correlations; they didn’t depend on substantial knowledge of the customer or product attributes.
- For more on Opera Solutions’ work at Schwan’s, see Dennis Berman’s article in the Wall Street Journal, “So, What’s Your Algorithm?”
- Additional Insights that can improve Sales Effectiveness
• What are the characteristics of my most loyal customers? Least loyal?
• How do customers feel about our company and products?
• Which items drive sales? Which items are frequently purchased together?
• If I discount an item by X, what impact will it have on sales and revenue?
• How do my internet sales compare to brick and mortar in terms of revenue and cost?
• Which prospects should I target to convert into loyal customers? What products or offers would be most effective?
• Will my inventory levels meet sales forecast? When will we run out of stock?
“More firms will adopt Amazon EC2, EMR or Google App Engine platforms for data analytics. Put in a credit card, buy an hour’s or a month’s worth of compute and storage. Get charged for what you use, with no sign-up period or fee, and the ability to fire up complex analytic systems whether you are a small or large player.” – Ravi Kalakota’s forecast
Big data Analytics = Technologies and techniques for working productively with data, at any scale.
Analytics-as-a-Service is cloud-based: elastic and highly scalable, with no upfront capital expense; you pay only for what you use, available on demand.
The combination of the two is the emerging new trend. Why? Many organizations are starting to think about “analytics-as-a-service” as they struggle to cope with the problem of analyzing massive amounts of data to find patterns, extract signals from background noise and make predictions. In our discussions with CIOs and others, we are increasingly talking about leveraging the private or public cloud computing to build an analytics-as-a-service model.
Analytics-as-a-Service is an umbrella term I am using to encapsulate “Data-as-a-Service” and “Hadoop-as-a-Service” strategies. It is also sexier :-)
The strategic goal is to harness data to drive insights and better decisions faster than competition as a core competency. Executing this goal requires developing state-of-the-art capabilities around three facets: algorithms, platform building blocks, and infrastructure.
Analytics is moving out of the IT function and into the business – marketing, research and development, strategy. As a result of this shift, the focus is more on speed-to-insight than on common or low-cost platforms. In most IT organizations it takes anywhere from 6 weeks to 6 months to procure and configure servers, then another several months to load, configure and test software – not very fast for a business user who needs to churn data and test hypotheses. Hence analytics-in-the-cloud alternatives are gaining traction with business users.
I went to an interesting talk by Ed Brandman, CIO of KKR & Co., the legendary private equity firm, hosted by CIO Perspectives in New York City. KKR is using a custom business intelligence solution, called Portfolio Central, to track and manage its portfolio of 62 companies. The portfolio includes some well-known companies like First Data, Toys R Us, SunGard, Dollar General, HCA and others.