Another day, another data breach. I just received another “We’re sorry you got hacked” letter.
This is the fifth such letter I have received in the past three months: Forbes.com, Target, Neiman Marcus, a credit card company and a previous employer. What is going on?
Why aren’t firms investing in beefing up their predictive ability to spot cyber-security intrusion threats? What is taking them so long to identify attacks? Why is the attack signature – sophisticated, self-concealing malware – so difficult to spot? Do firms need to invest in NSA PRISM-style threat monitoring capabilities?
The three impediments to discovering and following up on attacks are:
- Volume, velocity and variety – not collecting the appropriate security data
- Immaturity – not identifying the relevant event context (event correlation)
- Lack of system and vulnerability awareness
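The event-correlation gap in particular lends itself to a concrete illustration. The sketch below (all event names, sources and thresholds are hypothetical, not drawn from any real SIEM product) groups raw security events per user and flags a failed-login burst followed shortly by a bulk data export:

```python
from collections import defaultdict

# Hypothetical event records: (timestamp_seconds, source, event_type, user)
events = [
    (100, "vpn",      "login_failure", "alice"),
    (105, "vpn",      "login_failure", "alice"),
    (110, "vpn",      "login_success", "alice"),
    (112, "database", "bulk_export",   "alice"),
    (500, "vpn",      "login_success", "bob"),
]

def correlate(events, window=60):
    """Flag users whose failed logins are followed, within `window`
    seconds, by a bulk data export - a pattern no single event reveals."""
    by_user = defaultdict(list)
    for ts, source, etype, user in events:
        by_user[user].append((ts, etype))
    suspicious = []
    for user, evs in by_user.items():
        evs.sort()
        fails = [ts for ts, e in evs if e == "login_failure"]
        exports = [ts for ts, e in evs if e == "bulk_export"]
        if any(0 < ex - f <= window for f in fails for ex in exports):
            suspicious.append(user)
    return suspicious

print(correlate(events))  # alice's failure -> export chain is flagged
```

A production system would do this over streaming data at far greater volume, velocity and variety; the point is that without the correlation step, each event looks benign on its own.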
Obviously, where there is pain there is opportunity for entrepreneurs. There is a growing focus on big data use cases for security analytics after all the breaches we are seeing. General Electric announced it had completed a deal to buy Wurldtech, a Vancouver-based cyber-security firm that protects big industrial sites such as refineries and power plants from cyber attacks.
Three recent breaches affected me personally – Forbes, Target and Neiman Marcus.
“Google, Facebook are really big data companies, not software companies. They collect data, process it and sell it back with value added extensions. They don’t have better algorithms. They simply have more data.” — Anonymous
The convergence of cloud, social, mobile and connected computing has sparked a data revolution. More than 90 percent of the world’s data has been generated over the last two years. And with a projected 50 billion connected “things” by 2020, the volume of data available is expected to grow exponentially. This proliferation of data has created a vast ocean of potential insights for companies, allowing them to know their customers in a whole new way.
Data is valuable. Data is plentiful. Data is complex. Data is in flux. Data is fast moving. Capturing and managing data (Cloud, On-Premise, Hybrid IT) is challenging. It’s a paradox of the information age. The glut of information that bombards us daily too frequently obscures true insight.
Helping people uncover, see, understand and visualize data presents a broad and momentous market opportunity… call this user-driven discovery. Take Facebook: like Amazon.com, it builds a custom Web page every time you visit. It pores over all the actions your friends have taken – their postings, photos, likes, the songs they listen to, the products they like – and determines in milliseconds which items you might wish to see, and in what order. Is this the future for every firm?
The opportunity is simply getting bigger by the day. Every customer interaction is generating a growing trail of data (“data exhaust”). Every machine that services the customer is generating data. Every conversation, transaction, engagement, touchpoint location, offer, response is a potential digital bread-crumb of opportunity.
Now let’s flip the context. A typical mobile user checks their phone 150 times a day for updates – a Gen Y or Millennial user obviously far more than a Gen X user. Consumption patterns for information are changing continuously. Facebook-style real-time updates, revolutionary five years ago, seem outdated in the mobile world. We live in an “attention deficit economy” where attention is the new basis for competition. The firms that use data to create evolving experiences that can grab and hold your attention will attract marketing and ad dollars.
As a result, the buzz and hype around data…small data, big data, machine data, social data, mobile data, wearables data….is relentless. As a result there are a lot of new initiatives and companies. I have been asked repeatedly by a lot of entrepreneurs and strategy teams about analytics market size and opportunity size. Product and services firms are also interested in opportunity sizing as they create new offerings in the data rich world.
I thought I would share a mashup of the industry and market sizing data I have collected so far.
- How big is the overall market for Analytics, Big Data?
- How big is the market for Digital Customer Interaction or Engagement?
- How big is the market for Mobile and Social Intelligence?
- How big is the market for Wearables?
- What is growing fast, faster and fastest?
All good questions as services firms think about digital strategy, analytics and their future state. You always want to be in the “hot” area… selling is easier, valuations are richer, and revenue growth rates are steeper.
The new frontier of self-tracking, seamless engagement and personal efficiency improvement is Personalized Big Data and Digital Health. This is becoming a viable idea around wearable and sensor computing, and the basis for new data platform wars.
The new platforms for the digital life or data-driven life – which collect, aggregate and disseminate – will cover a wide range of new User Experience (UX) use cases and end-points: medical devices, sensor-enabled wristwear, headsets/glasses and tech-sensitive clothing. All of them are going to collect a lot of data, run low-latency analytics and enable data visualization. Several new firms are entering the activity tracker market: LG (Lifeband Touch), Sony (the Core), Garmin (Vivofit), GlassUp, Pebble, JayBird Reign and others.
Data collection is just one piece of the solution. The foundation for personalized big data is descriptive and predictive analytics: OK, what do I do next? What is the suggestion? – delivered in the form of predictive search (automated deduction or augmented reality).
How do I discover useful patterns in, analyze, visualize, share, query and mobilize the collected data? A wide range of start-ups – Cue, reQall, Donna, Tempo AI, MindMeld, Evernote, Osito and Dark Sky – and big companies like Apple, Google, Microsoft, LG and Samsung are working on predictive apps aimed at enabling new robo-assistants that act as personal valets, anticipating what you need before you ask for it.
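As a minimal illustration of the descriptive-plus-predictive pairing described above (the step counts and the simple trend rule are invented for the example, not any vendor's method):

```python
# Illustrative only: daily step counts from a hypothetical wearable.
steps = [4200, 5100, 4800, 6000, 5600, 5900, 6300]

# Descriptive analytics: summarize what happened.
mean_steps = sum(steps) / len(steps)

# Naive predictive analytics: project tomorrow from the recent trend
# (average day-over-day change added to the last observation).
deltas = [b - a for a, b in zip(steps, steps[1:])]
trend = sum(deltas) / len(deltas)
forecast = steps[-1] + trend

print(round(mean_steps), round(forecast))  # prints 5414 6650
```

Real predictive apps layer far richer context (location, calendar, history) on top, but the split is the same: describe what the data says, then suggest what comes next.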
The following eight secular disruptive themes are what Goldman Sachs believes have the potential to reshape their categories and command greater investor attention in the coming years.
The Eight Themes:
- E-cigarettes – The potential to transform the tobacco industry
- Cancer Immunotherapy – The future of cancer treatment?
- LED Lighting – A large, early-stage and multi-decade opportunity
- Alternative Capital – Rise of a new asset class means growing risk for reinsurers
- Natural Gas Engines – Attractive economics drive strong, long-term penetration
- Software Defined Networking (SDN) – Re-inventing networking for the cloud era
- 3D Printing – Disruption materializing
- Big Data – Solutions trying to keep up with explosive data growth and complexity (Industrial Big Data and Personalized Big Data)
These eight themes – through product or business innovation – are, Goldman claims, poised to transform addressable markets or open up entirely new ones, offering growth insulated from the broader macro environment and creating value for stakeholders.
Goldman focuses on the impact of creative destruction – a term made famous by the Austrian economist Joseph Schumpeter – the idea that innovation constantly breeds new leaders and replaces the old.
Health expenditures in the United States crossed $3.0 trillion in 2013, more than ten times the $256 billion spent in 1980.
Almost 15% of U.S. GDP is spent on healthcare… a staggering number. As a mega-vertical, healthcare covers several major segments (the 7 Ps):
- Payers (health insurance and health plans)
- Providers (hospital systems, labs and IDNs)
- Pharmacy (retail distribution networks)
- Pharmaceutical and medical equipment manufacturers
- Prescribers (physicians, clinics and pharmacy minute clinics)
- Police (regulators, the FDA)
- Patients (consumers)
The healthcare system is a complex beast and difficult to navigate, so providers need to make it easier for patients. They are using people resources such as care coordinators and patient navigators to help patients find their way through the system.
On the payer side, the focus of digitizing health today is reducing the amount of waste in the health care system via new forms of health IT and analytics that cut inefficiencies, redundancies and administrative costs.
According to the CEO of Aetna, “the health care system wastes more than $765 billion each year – that’s 30 percent of our health care spending.”
While spending on health care is dominating headlines, the health care industry (7Ps) is in a state of flux. Stakeholders across the health care sector are running hard to reduce costs. The drivers impacting cost of healthcare include:
- Aging population – Patient history and patterns of care impacting patient readmission rates
- Rise in Chronic Disease – 75% of cost – Prevention not reactive medicine
- Drug cost – escalating for certain therapies (Generics exchanged for biological drugs)
The healthcare ecosystem is being reshaped by two powerful countervailing economic forces: (1) improving the quality of care and (2) driving the cost of care down. Basically, spend less and get more.
As a result, the entire healthcare ecosystem is changing into an “information-driven”, “evidence-based” and “outcome-driven” model.
The target healthcare transformation goals are:
- align economic incentives between payers and providers,
- digital engagement…create a simpler, more transparent consumer experience, and
- connected health….technologies that seamlessly connect our healthcare system.
In this posting we look at digital health care use cases and how data and analytics are slowly but surely being adopted in the form of informatics. All this change is being driven under the banner of health reform.
The “real meat and potatoes” use cases behind actual big data adoption might be around B2B machine data management and industrial analytics enabled by wireless, battery-free sensor platforms.
While social, consumer, retail and mobile big data get a lot of PR, the big data business cases around industrial machine data analytics or “things that spin” actually make economic sense. These projects tend to show tangible Return on Investment (ROI).
The concept of Internet-connected machines that collect telemetry data and communicate – often called the “Internet of Things” or M2M – has been marketed for several years:
- I.B.M. has its “Smarter Planet” initiative
- Cisco has its “Internet of Everything” initiative
- GE has its “Industrial Internet” initiative
- Salesforce.com has its “Internet of Customers” theme
To compete with GE, Hitachi, United Technologies, Siemens, Bosch, Schneider Electric, Philips and other industrial giants are all getting on the bandwagon, as the vision of M2M is now viable thanks to advances in microelectronics, wireless communications and microfabricated (MEMS) sensing, enabling platforms of rapidly diminishing size.
The Bosch Group has embarked on a series of initiatives across business units that make use of data and analytics to provide so-called intelligent customer offerings. These include intelligent fleet management, intelligent vehicle-charging infrastructures, intelligent energy management, intelligent security video analysis, and many more. To identify and develop these innovative services, Bosch created a Software Innovations group that focuses heavily on big data, analytics, and the “Internet of Things.”
Similarly, Schneider Electric focuses primarily on energy management, including energy optimization, smart-grid management, and building automation. Its Advanced Distribution Management System, for example, handles energy distribution in utility companies. ADMS monitors and controls network devices, manages service outages, and dispatches crews. It gives utilities the ability to integrate millions of data points on network performance and lets engineers use analytics to monitor the network.
The Industrial Internet – making smart use of sensors, networked machines and data analytics – is the big vision, but the business driver is zero unplanned downtime for customers.
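A stripped-down sketch of how machine data analytics targets unplanned downtime: flag sensor readings that drift sharply from recent history, so maintenance happens before the failure. The readings, window and threshold below are illustrative, not from any vendor's product:

```python
import statistics

# Hypothetical vibration readings from a sensor on rotating equipment
# ("things that spin"); the spike at index 7 simulates a developing fault.
readings = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 0.95, 0.50, 0.49]

def anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate from the trailing window by more than
    `threshold` standard deviations - a simple early-warning signal."""
    flagged = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu = statistics.mean(hist)
        sigma = statistics.stdev(hist)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

print(anomalies(readings))  # prints [7]
```

The tangible ROI comes from acting on that flag: scheduling a repair during planned downtime is far cheaper than an outage.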
As a data engineer and scientist, I have been following the NSA PRISM raw intelligence mining program with great interest. The engineering complexity, breadth and scale are simply amazing compared to, say, credit card analytics (Fair Isaac) or marketing analytics firms like Acxiom.
Some background… PRISM – “Planning Tool for Resource Integration, Synchronization, and Management” – is a top-secret data-mining “connect-the-dots” program aimed at terrorism detection and other pattern extraction authorized by federal judges working under the Foreign Intelligence Surveillance Act (FISA). PRISM allows the U.S. intelligence community to look for patterns across multiple gateways across a wide range of digital data sources.
PRISM is an unstructured big data aggregation framework – audio and video chats, phone call records, photographs, e-mails, documents, financial transactions and transfers, internet searches, Facebook posts, smartphone logs and connection logs – plus the analytics that enable analysts to extract patterns: save and analyze all the digital breadcrumbs people don’t even know they are creating.
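Setting aside what PRISM actually does internally (which is not public), the generic “connect-the-dots” idea can be sketched as link analysis over a contact graph: group identifiers that are connected by any chain of interactions. The call records below are entirely synthetic:

```python
from collections import defaultdict

# Purely illustrative synthetic call records (caller, callee).
calls = [("a", "b"), ("b", "c"), ("d", "e")]

def clusters(calls):
    """Group identifiers linked by any chain of calls - a minimal
    'connect the dots' over a contact graph."""
    graph = defaultdict(set)
    for x, y in calls:
        graph[x].add(y)
        graph[y].add(x)
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:                      # depth-first traversal
            n = stack.pop()
            if n in group:
                continue
            group.add(n)
            stack.extend(graph[n] - group)
        seen |= group
        groups.append(group)
    return groups

print(clusters(calls))  # two clusters: {a, b, c} and {d, e}
```

Even this toy shows why scale matters: the more gateways and data sources feeding the graph, the more dots there are to connect.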
The whole NSA program raises an interesting debate about “Sed quis custodiet ipsos custodes?” (“But who will watch the watchers?”)
- How do I monetize my data? How do we turn data into dollars?
- What small data or big data monetization strategies should I adopt?
- Which analytical investments and strategies really increase revenue?
- What pilots should I run to test data monetization ideas out?
Data monetization is the process of converting data (raw or aggregated) into something useful and valuable – helping make decisions (such as predictive maintenance) based on multiple sources of insight. It creates opportunities for organizations with significant data volume to leverage untapped or under-tapped information and create new sources of revenue (e.g., cross-sell and upsell lift, or prevention of equipment breakdowns).
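One common calculation behind the cross-sell lift mentioned above, sketched with invented baskets: how much more often do two products co-occur than chance alone would predict?

```python
# Hypothetical transaction baskets; a real monetization pilot would
# pull these from order history.
baskets = [
    {"phone", "case"},
    {"phone", "case", "charger"},
    {"phone", "charger"},
    {"case"},
]

def cross_sell_lift(baskets, a, b):
    """Lift = P(a and b together) / (P(a) * P(b)); a value above 1
    means the pair co-occurs more often than chance, marking a
    candidate cross-sell offer."""
    n = len(baskets)
    p_a = sum(a in bk for bk in baskets) / n
    p_b = sum(b in bk for bk in baskets) / n
    p_ab = sum(a in bk and b in bk for bk in baskets) / n
    return p_ab / (p_a * p_b)

print(cross_sell_lift(baskets, "phone", "charger"))  # ~1.33, above chance
```

A pilot would compute lift across many product pairs and test offers against the highest-lift candidates.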
But, data monetization requires a new IT clock-speed that most firms are struggling with. Aberdeen Research found that the average time it takes for IT to complete BI support requests, with traditional BI software, is 8 days to add a column to a report and 30 days to build a new dashboard. For an individual information worker trying to find an answer, make a decision, or solve a problem, this is simply untenable. For an organization that is trying to differentiate itself on information innovation or data-driven decision making, it is a major barrier to strategy execution.
To speed up insight generation and decision making (all elements of data monetization), business users are bypassing IT and investing in data visualization (Tableau) or data discovery (QlikView) platforms. These platforms help users ask and answer their own stream of questions and follow their own path to insight. Unlike traditional BI, which provides dashboards, heatmaps and canned reports, these tools offer a discovery platform rather than a pre-determined path.
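The discovery pattern can be sketched as a chain of ad-hoc questions, each shaped by the previous answer, with no pre-built report in between (the sales rows are made up for the example):

```python
# Hypothetical sales records a business user might explore directly.
sales = [
    {"region": "East", "product": "A", "revenue": 120},
    {"region": "East", "product": "B", "revenue": 80},
    {"region": "West", "product": "A", "revenue": 200},
    {"region": "West", "product": "B", "revenue": 40},
]

# Q1: which region leads on revenue?
by_region = {}
for row in sales:
    by_region[row["region"]] = by_region.get(row["region"], 0) + row["revenue"]
top_region = max(by_region, key=by_region.get)

# Q2 (follow-up prompted by the answer): within that region,
# which product drives the revenue?
drilldown = {}
for row in sales:
    if row["region"] == top_region:
        drilldown[row["product"]] = drilldown.get(row["product"], 0) + row["revenue"]
top_product = max(drilldown, key=drilldown.get)

print(top_region, top_product)  # prints West A
```

In a canned-report world, each of these follow-ups would be a change request to IT; in a discovery tool the user just keeps asking.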
Companies like Marketo, which build marketing automation software, are also getting into the customer engagement and data monetization game. Their focus is to help marketing professionals find more future customers; to build, sustain and grow relationships with those buyers over time; and to cope with the sheer pace and complexity of engaging with customers in real time across the web, email, social media, online and offline events, video, e-commerce storefronts, mobile devices and a variety of other channels. In many companies, marketing knits these digital interactions together across multiple disconnected systems. The ability to interact seamlessly with customers across multiple fast-moving digital channels requires an engagement strategy enabled by data and analytic insights.
If the analytics team wrestles with getting access to data, how timely are the insights?
To address the question, global CIOs are shifting their strategy – “I need to build a data-as-a-service offering for my data” – to enable the analytics users in the organization. The more advanced CIOs are asking, “How should I build data science capabilities as a shared foundation service?”
The CIO challenge is not trivial. Successful organizations today operate within application and data ecosystems that extend across front-to-back functions (sales and marketing all the way to fulfillment and service) and well beyond their own boundaries. They must connect digitally to their suppliers, partners, distributors, resellers, regulators and customers. Each of these has its own “data fabric” and applications that were never designed to connect, so with all the data-as-a-service and big data rhetoric, the application development community is being asked to “work magic” in bringing them together.
Underutilization and the complexity of managing growing data sprawl are not new, but the urgency to address them has increased dramatically over the last several years. Data-as-a-Service (DaaS) is seen as a big opportunity to improve IT efficiency and performance through centralization of resources. DaaS strategies have multiplied in the last few years with the maturation of technologies such as data virtualization, data integration, MDM, SOA, BPM and Platform-as-a-Service.
The questions accelerating the Data-as-a-Service (DaaS) trend: How do we deliver the right data to the right place at the right time? How do we “virtualize” data often trapped inside applications? How do we support changing business requirements (analytics, reporting and performance management) in spite of ever-changing data volumes and complexity?
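One way to picture the data virtualization behind a DaaS interface, using two invented source systems: a single function resolves a logical customer view at request time, without copying data between the systems that hold it.

```python
# Toy data-virtualization facade. The two source systems below are
# hypothetical: a CRM and a billing system with different record shapes.
crm_rows = [{"cust_id": 1, "name": "Acme"}]
billing_rows = [{"customer": 1, "balance": 250.0}]

def get_customer(cust_id):
    """Resolve one logical 'customer' view across both systems at
    request time - the essence of serving data as a service rather
    than replicating it into yet another silo."""
    crm = next((r for r in crm_rows if r["cust_id"] == cust_id), {})
    bill = next((r for r in billing_rows if r["customer"] == cust_id), {})
    return {
        "id": cust_id,
        "name": crm.get("name"),
        "balance": bill.get("balance"),
    }

print(get_customer(1))  # prints {'id': 1, 'name': 'Acme', 'balance': 250.0}
```

In a real deployment the two lookups would be live queries against the source applications, and the facade would sit behind an API so every consumer gets the same unified view.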