Health expenditures in the United States crossed $3.0 trillion in 2013, more than ten times the $256 billion spent in 1980.
Nearly 18% of U.S. GDP is spent on healthcare… a staggering number. As a mega-vertical, healthcare covers several major segments (the 7 Ps):
- Payers (health insurance and health plans)
- Providers (hospital systems, labs, and IDNs)
- Pharmacy (retail distribution networks)
- Pharmaceutical and medical equipment manufacturers
- Prescribers (physicians, clinics, and pharmacy minute clinics)
- Police (regulators, FDA)
- Patients (consumers)
The healthcare system is a complex beast and difficult to navigate, so providers need to make it easier for patients. They are deploying people resources such as care coordinators and patient navigators to help patients find their way through the system.
On the payer side, the focus of digitizing health today is reducing the amount of waste in the health care system through new forms of health IT and analytics… tools that reduce inefficiencies, redundancies, and administrative costs.
According to the CEO of Aetna, “the health care system wastes more than $765 billion each year – that’s 30 percent of our health care spending.”
While spending on health care dominates headlines, the health care industry (the 7 Ps) is in a state of flux. Stakeholders across the sector are running hard to reduce costs. The drivers impacting the cost of healthcare include:
- Aging population – patient history and patterns of care are driving up readmission rates
- Rise in chronic disease – roughly 75% of costs; the imperative is prevention, not reactive medicine
- Drug costs – escalating for certain therapies (generics being exchanged for biologic drugs)
The healthcare ecosystem is being reshaped by two powerful countervailing economic forces: (1) improve the quality of care and (2) drive the cost of care down. Basically, spend less and get more.
As a result, the entire healthcare ecosystem is shifting to an “information-driven”, “evidence-based”, and “outcome-driven” model.
The target healthcare transformation goals are:
- align economic incentives between payers and providers,
- digital engagement – creating a simpler, more transparent consumer experience, and
- connected health – technologies that seamlessly connect our healthcare system.
In this post we look at digital health care use cases and how data and analytics are slowly but surely being adopted in the form of informatics. All this change is being driven under the guise of health reform.
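To make the informatics angle concrete, here is a minimal sketch of the kind of model a provider analytics team might build – a 30-day readmission risk score driven by patient history. The features, data, and decision threshold below are hypothetical, purely for illustration:

```python
# Hypothetical sketch: scoring 30-day readmission risk from patient history.
# Feature names and data are illustrative, not from any real system.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Toy encounter-level data: prior admissions, chronic conditions, age
patients = pd.DataFrame({
    "prior_admissions":   [0, 3, 1, 5, 2, 0, 4, 1],
    "chronic_conditions": [0, 2, 1, 4, 2, 1, 3, 0],
    "age":                [34, 71, 55, 80, 62, 45, 77, 29],
    "readmitted_30d":     [0, 1, 0, 1, 1, 0, 1, 0],  # observed outcome
})

X = patients[["prior_admissions", "chronic_conditions", "age"]]
y = patients["readmitted_30d"]
model = LogisticRegression().fit(X, y)

# Score a new discharge; above some threshold, a care coordinator
# or patient navigator could be assigned proactively
new_patient = pd.DataFrame([[2, 3, 68]], columns=X.columns)
risk = model.predict_proba(new_patient)[0, 1]
print(f"30-day readmission risk: {risk:.0%}")
```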
The real “meat and potatoes” use cases behind actual big data adoption might be around B2B machine data management and industrial analytics enabled by wireless, battery-free sensor platforms.
While social, consumer, retail, and mobile big data get a lot of PR, the big data business cases around industrial machine data analytics – “things that spin” – actually make economic sense. These projects tend to show tangible return on investment (ROI).
The concept of Internet-connected machines that collect telemetry data and communicate – often called the “Internet of Things” or M2M (machine-to-machine) – has been marketed for several years:
- IBM has its “Smarter Planet” initiative
- Cisco has its “Internet of Everything” initiative
- GE has its “Industrial Internet” initiative
- Salesforce.com has its “Internet of Customers” theme
To compete with GE, Hitachi, United Technologies, Siemens, Bosch, Schneider Electric, Philips, and other industrial giants are all getting on the bandwagon, as the M2M vision is now viable thanks to advances in microelectronics, wireless communications, and microfabricated (MEMS) sensing that enable platforms of rapidly diminishing size.
The Bosch Group has embarked on a series of initiatives across business units that make use of data and analytics to provide so-called intelligent customer offerings. These include intelligent fleet management, intelligent vehicle-charging infrastructures, intelligent energy management, intelligent security video analysis, and many more. To identify and develop these innovative services, Bosch created a Software Innovations group that focuses heavily on big data, analytics, and the “Internet of Things.”
Similarly, Schneider Electric focuses primarily on energy management, including energy optimization, smart-grid management, and building automation. Its Advanced Distribution Management System, for example, handles energy distribution in utility companies. ADMS monitors and controls network devices, manages service outages, and dispatches crews. It gives utilities the ability to integrate millions of data points on network performance and lets engineers use analytics to monitor the network.
The Industrial Internet – making smart use of sensors, networked machines, and data analytics – is the big vision, but the immediate business driver is eliminating unplanned downtime for customers.
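What does “no unplanned downtime” look like in practice? At its simplest: streaming sensor telemetry plus an anomaly rule that triggers maintenance before failure. Here is a minimal sketch – the vibration signal, window size, and alert threshold are all made up for illustration:

```python
# Illustrative sketch: flag abnormal vibration readings from a machine
# sensor before they become failures. Signal, window and threshold are
# hypothetical; real deployments tune these per asset.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
vibration = pd.Series(rng.normal(1.0, 0.05, 500))   # healthy baseline
vibration.iloc[450:] += np.linspace(0, 0.5, 50)     # simulated bearing wear

# Rolling z-score: how far is the latest reading from recent behavior?
rolling_mean = vibration.rolling(window=60).mean()
rolling_std = vibration.rolling(window=60).std()
z_score = (vibration - rolling_mean) / rolling_std

ALERT_THRESHOLD = 3.0   # sigmas
alerts = z_score[z_score > ALERT_THRESHOLD]
if not alerts.empty:
    print(f"Maintenance alert at reading {alerts.index[0]} "
          f"(z = {alerts.iloc[0]:.1f}) - schedule inspection")
```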
As a data engineer and scientist, I have been following the NSA PRISM raw intelligence mining program with great interest. The engineering complexity, breadth, and scale are simply amazing compared to, say, credit card analytics (Fair Isaac) or marketing analytics firms like Acxiom.
Some background… PRISM – “Planning Tool for Resource Integration, Synchronization, and Management” – is a top-secret, “connect-the-dots” data-mining program aimed at terrorism detection and other pattern extraction, authorized by federal judges working under the Foreign Intelligence Surveillance Act (FISA). PRISM allows the U.S. intelligence community to look for patterns across multiple gateways and a wide range of digital data sources.
PRISM is an unstructured big data aggregation framework – audio and video chats, phone call records, photographs, e-mails, documents, financial transactions and transfers, internet searches, Facebook posts, smartphone logs, and connection logs – plus the analytics that enable analysts to extract patterns. Save and analyze all the digital breadcrumbs people don’t even know they are creating.
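To give a flavor of “connect-the-dots” pattern extraction – and only a flavor, since PRISM’s actual architecture is not public – here is a toy sketch that fuses records from multiple hypothetical sources into one graph and surfaces the “broker” entities that bridge otherwise separate clusters:

```python
# Toy "connect-the-dots" sketch: fuse observations from multiple
# hypothetical sources into one graph, then rank bridging entities.
# Illustrates the pattern-extraction idea only; entirely fabricated data.
import networkx as nx

G = nx.Graph()
observations = [            # (entity, entity, source) triples
    ("alice", "bob", "email"),
    ("bob", "carol", "phone"),
    ("carol", "dave", "wire_transfer"),
    ("alice", "dave", "chat"),
]
for a, b, source in observations:
    G.add_edge(a, b, source=source)

# Betweenness centrality surfaces "broker" nodes that connect clusters
scores = nx.betweenness_centrality(G)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(node, round(score, 2))
```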
The whole NSA program raises an interesting debate about “Sed quis custodiet ipsos custodes?” (“But who will watch the watchers?”) Read more
- How do I monetize my data? How do we turn data into dollars?
- What small data or big data monetization strategies should I adopt?
- Which analytical investments and strategies really increase revenue?
- What pilots should I run to test out data monetization ideas?
Data monetization is the process of converting data (raw or aggregate) into something useful and valuable – helping make decisions (such as predictive maintenance calls) based on multiple sources of insight. Data monetization creates opportunities for organizations with significant data volume to leverage untapped or under-tapped information and create new sources of revenue (e.g., cross-sell and upsell lift, or prevention of equipment breakdowns).
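As a back-of-the-envelope illustration of one of those levers, here is what measuring cross-sell lift looks like in code. All the numbers are hypothetical:

```python
# Back-of-envelope illustration of cross-sell lift, one of the
# monetization levers mentioned above. All numbers are hypothetical.
control_rate = 0.020      # conversion without data-driven offers
treated_rate = 0.031      # conversion with model-targeted offers
customers = 500_000
avg_order_value = 120.0   # dollars

lift = treated_rate / control_rate - 1
incremental = (treated_rate - control_rate) * customers * avg_order_value
print(f"Lift: {lift:.0%}")                        # 55%
print(f"Incremental revenue: ${incremental:,.0f}")  # $660,000
```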
But data monetization requires a new IT clock-speed that most firms struggle to achieve. Aberdeen Research found that the average time it takes IT to complete BI support requests with traditional BI software is 8 days to add a column to a report and 30 days to build a new dashboard. For an individual information worker trying to find an answer, make a decision, or solve a problem, this is simply untenable. For an organization trying to differentiate itself on information innovation or data-driven decision making, it is a major barrier to strategy execution.
To speed up insight generation and decision making (both elements of data monetization), business users are bypassing IT and investing in data visualization (Tableau) and data discovery platforms (QlikView). These platforms help users ask and answer their own stream of questions and follow their own path to insight. Unlike traditional BI, which provides dashboards, heatmaps, and canned reports, these tools offer a discovery platform rather than a pre-determined path.
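The difference is the iteration loop: each answer prompts the next question, with no IT ticket in between. A rough flavor of that loop, using a hypothetical sales extract and column names:

```python
# Flavor of "ask your own stream of questions" against a raw extract,
# rather than waiting days for a new report column. Data is hypothetical.
import pandas as pd

sales = pd.DataFrame({
    "region":       ["West", "West", "East", "East", "West", "East"],
    "product_line": ["A", "B", "A", "B", "A", "A"],
    "quarter":      ["Q1", "Q1", "Q1", "Q1", "Q2", "Q2"],
    "revenue":      [100, 80, 90, 70, 60, 95],
})

# Question 1: which regions are declining quarter over quarter?
by_region = sales.pivot_table(index="region", columns="quarter",
                              values="revenue", aggfunc="sum")
print(by_region.pct_change(axis=1))

# Question 2, prompted by the first answer: is the West's decline
# concentrated in one product line?
west = sales[sales["region"] == "West"]
print(west.groupby(["product_line", "quarter"])["revenue"].sum())
```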
Also, companies like Marketo, which build marketing automation software, are getting into the customer engagement and data monetization game. Their focus is to enable marketing professionals to find more future customers; to build, sustain, and grow relationships with those buyers over time; and to cope with the sheer pace and complexity of engaging with customers in real time across the web, email, social media, online and offline events, video, e-commerce storefronts, mobile devices, and a variety of other channels. In many companies, marketing knits these digital interactions together across multiple disconnected systems. The ability to interact seamlessly with customers across multiple fast-moving digital channels requires an engagement strategy enabled by data and analytic insights.
If the analytics team wrestles with getting access to data, how timely are the insights?
To address the question, global CIOs are shifting their strategy – “I need to build a data-as-a-service offering for my data” – to enable the analytics users in the organization. The more advanced CIOs are asking, “How should I build data science capabilities as a shared foundation service?”
The CIO challenge is not trivial. Successful organizations today operate within application and data ecosystems that extend across front-to-back functions (sales and marketing all the way to fulfillment and service) and well beyond their own boundaries. They must connect digitally to their suppliers, partners, distributors, resellers, regulators, and customers. Each of these has its own “data fabric” and applications that were never designed to connect, so with all the data-as-a-service and big data rhetoric, the application development community is being asked to “work magic” in bringing them together.
Underutilization and the complexity of managing growing data sprawl are not new. But the urgency to address them has increased dramatically over the last several years. Data-as-a-Service (DaaS) is seen as a big opportunity to improve IT efficiency and performance through centralization of resources. DaaS strategies have proliferated in the last few years with the maturation of technologies such as data virtualization, data integration, MDM, SOA, BPM, and Platform-as-a-Service.
The questions accelerating the Data-as-a-Service (DaaS) trend: How do we deliver the right data to the right place at the right time? How do we “virtualize” the data often trapped inside applications? How do we support changing business requirements (analytics, reporting, and performance management) in spite of ever-changing data volumes and complexity?
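One very reduced sketch of the DaaS idea: put a service interface in front of data trapped in an application system, so consumers get a governed view on demand instead of building their own extracts. The framework choice (Flask) and the schema below are illustrative assumptions, not a reference architecture:

```python
# Minimal data-as-a-service sketch: expose a governed "customer" view
# through a service API instead of point-to-point extracts. Flask and
# the schema are illustrative assumptions only.
import sqlite3
from flask import Flask, jsonify

DB_PATH = "app_system.db"  # stands in for an application database

def init_demo_db() -> None:
    # Seed a toy application table so the sketch is self-contained
    conn = sqlite3.connect(DB_PATH)
    conn.execute("CREATE TABLE IF NOT EXISTS customers "
                 "(id TEXT PRIMARY KEY, name TEXT, segment TEXT)")
    conn.execute("INSERT OR IGNORE INTO customers VALUES "
                 "('C001', 'Acme Corp', 'Enterprise')")
    conn.commit()
    conn.close()

app = Flask(__name__)

@app.route("/api/v1/customers/<customer_id>")
def get_customer(customer_id):
    # The service owns the mapping from raw application tables to the
    # published view; consumers just call the endpoint.
    conn = sqlite3.connect(DB_PATH)
    row = conn.execute("SELECT id, name, segment FROM customers WHERE id = ?",
                       (customer_id,)).fetchone()
    conn.close()
    if row is None:
        return jsonify({"error": "not found"}), 404
    return jsonify({"id": row[0], "name": row[1], "segment": row[2]})

if __name__ == "__main__":
    init_demo_db()
    app.run(port=8080)   # GET /api/v1/customers/C001 returns the view
```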
Everyone is abundantly aware of the changing landscape within the financial services industry. Over the past five years, we’ve seen a massive regulatory overhaul and an industry-wide push to enhance trust and confidence and encourage investor participation in the financial system.
To road-map Wall Street priorities for 2014 and 2015, we have recently been holding an interesting set of meetings with MDs and leading architects at global banks and investment services firms.
No longer business as usual. It is clear that banks are devoting more resources to Know Your Customer (KYC), Anti-Money Laundering (AML), fraud detection and prevention, and Office of Foreign Assets Control (OFAC) compliance. We are at the beginning stages of the process of building the Consolidated Audit Trail, or CAT.
To enable compliance with a variety of risk/regulatory, AML, and KYC initiatives, the big foundational investments are around:
1) Strengthening the Golden Sources – Security Master, Account Master, and Customer Master (a matching sketch follows this list).
2) Standardized, common global business processes, data, systems, and quantitative solutions that can be leveraged and executed across geographies, products, and markets to manage delinquency exposures and efficiently meet regulatory requirements for Comprehensive Capital Analysis and Review (CCAR), FDIC reporting, Basel, and stress-loss testing.
3) Various enterprise data management initiatives – Data Quality, Data Lineage, Data Lifecycle Management, Data Maturity and Enterprise Architecture procedures.
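To make the Golden Source investment concrete: the hard part of a Customer Master is deciding when two account records refer to the same customer. Here is a toy matching sketch – the fields, weights, and threshold are illustrative only:

```python
# Toy sketch of the matching problem behind a Customer Master "golden
# source": decide whether two account records refer to the same customer.
# Fields, weights and threshold are illustrative only.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec1: dict, rec2: dict) -> float:
    # Weighted blend of name and address similarity plus exact tax-ID match
    score = 0.5 * similarity(rec1["name"], rec2["name"])
    score += 0.3 * similarity(rec1["address"], rec2["address"])
    score += 0.2 * (rec1["tax_id"] == rec2["tax_id"])
    return score

r1 = {"name": "ACME Holdings Ltd", "address": "1 Canada Sq, London",
      "tax_id": "GB123"}
r2 = {"name": "Acme Holdings Limited", "address": "One Canada Square, London",
      "tax_id": "GB123"}

MATCH_THRESHOLD = 0.75
score = match_score(r1, r2)
print(score, score >= MATCH_THRESHOLD)   # likely the same customer
```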
Regulatory reporting improvements via next-generation Enterprise Data Warehouses (EDWs) – reporting on top of an EDW addresses the core problems faced by Finance, Risk, and Compliance when these functions extract their own feeds of data from the product systems through which the business is conducted and use differing platforms of associated reference data to support their reporting processes. A lot of current investment is in Finance EDWs, which deliver a common pool of contracts, positions, and balances, organized on an enterprise-wide basis and completed by anointed “gold” sources of reference data that ensure consistency and integration of information.
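The core EDW move, reduced to a few lines: every reporting function draws from one common pool of positions joined once, centrally, to the anointed gold reference data, rather than extracting its own feeds. A sketch with hypothetical tables and columns:

```python
# Sketch of the EDW pattern described above: one shared pool of positions
# enriched from a "gold" security master, so Finance, Risk and Compliance
# report off the same numbers. Tables and columns are hypothetical.
import pandas as pd

positions = pd.DataFrame({            # common pool of positions/balances
    "account": ["A1", "A2", "A1"],
    "security_id": ["US0378331005", "US5949181045", "US0378331005"],
    "quantity": [100, 250, -40],
})
security_master = pd.DataFrame({      # anointed gold source of reference data
    "security_id": ["US0378331005", "US5949181045"],
    "issuer": ["Apple Inc", "Microsoft Corp"],
    "asset_class": ["Equity", "Equity"],
})

# Downstream reports all join to the same reference data, once, centrally
enriched = positions.merge(security_master, on="security_id", how="left")
exposure = enriched.groupby(["issuer", "asset_class"])["quantity"].sum()
print(exposure)   # consistent issuer exposure for Finance, Risk, Compliance
```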
Crawl, walk, run seems to be the execution game plan, as the data complexity is pretty horrendous. Take Citi alone: approximately 200 million accounts and business in 160+ countries and jurisdictions.
The types of data challenges global banks like Citigroup and JPMorgan Chase are wrestling with include: Read more
Big Data emphasizes the exponential growth of data volumes worldwide (collectively, more than 2.5 exabytes per day).
Big Data incorporates the following key tenets: diversification, low latency, and ubiquity. In parallel, the emerging field of data science introduces new terms, including predictive modeling, machine learning, parallelized and in-database algorithms, MapReduce, and data monetization.
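Of those terms, MapReduce is the easiest to show in miniature: a map step that emits key/value pairs, a shuffle that groups them by key, and a reduce step that aggregates each group. Here it is simulated in a single process – real frameworks distribute each step across a cluster:

```python
# Single-process illustration of the MapReduce idea mentioned above:
# map each record to (key, value) pairs, then reduce values per key.
# Real frameworks (Hadoop, Spark) distribute these steps across a cluster.
from collections import defaultdict

records = ["error disk", "ok", "error net", "ok", "error disk"]

# Map: emit (token, 1) for each token in each record
mapped = [(token, 1) for record in records for token in record.split()]

# Shuffle: group values by key
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each key's values
counts = {key: sum(values) for key, values in groups.items()}
print(counts)   # {'error': 3, 'disk': 2, 'ok': 2, 'net': 1}
```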
A variety of infographics have been published around Big Data and data scientists. Here is a compendium of some very interesting ones.
[Infographics: The Real World of Big Data – Big Data Big Opportunity – A Data Scientist Study]
“Through 2015, more than 85 percent of Fortune 500 organizations will fail to effectively exploit big data for competitive advantage” – Gartner BI Summit.
It doesn’t take a genius to recognize the increasing demand to improve shareholder value and gain competitive advantage by leveraging information, data, and analytics as a strategic enterprise asset. The question is no longer about the importance of data, but when, how, and where to leverage the asset. Read more