5 Steps to Building a Big Data Business Strategy

“The problem is that, in many cases, big data is not used well. Companies are better at collecting data – about their customers, about their products, about competitors – than analyzing that data and designing strategy around it.”  “Companies Love Big Data but Lack the Strategy to Use It Effectively,” Harvard Business Review

How can this still be the case? I mean, after 5+ years of experience with Big Data, have we not learned a darn thing? We get the following observation from no less than the Harvard Business School:

“The new attention being given to data today is because suddenly, everywhere, it’s become much cheaper to measure,” says John A. Deighton, the Baker Foundation Professor of Business Administration at Harvard Business School. “Used well, it changes the basis of competition in industry after industry.”

“Used well, it [big data] changes the basis of competition in industry after industry!” What more does one need to say? In spite of its game-changing opportunity, organizations still have not gotten Big Data right.

This is consistent with what I observe on my many travels and conversations across a multitude of different organizations. I would estimate that less than 2% of these organizations really know what they are doing to exploit the potential value of big data to power their business models.

I think the biggest problem is that as soon as a big data strategy is completed, it’s outdated. New data sources of unknown potential emerge, new hardware innovations drive new capabilities, new data management tools and techniques evolve, new open-source advanced analytic tools pop out of universities, new edge analytic architectures become scalable, and so on.

The growing wealth of “monetizable” data (social media, mobile, IoT, wearables, images, photos, video) and the absolutely astounding availability of advanced analytic tools (many of them open source, such as MADlib, Mahout, H2O, OpenAI and Google’s powerful TensorFlow) are obliterating your big data strategy before your expensive consultants can even get the leather binding in place.

Too many organizations are making Big Data an IT project instead of making big data a strategic business initiative that exploits the power of data and analytics to power the organization’s business models.

Figure 1: Change Your Big Data Focus to Change Your Big Data Results



Build a business strategy that incorporates big data. Build a business strategy that uncovers detailed customer, product, service and operational insights that can be the foundation for optimizing key operational processes, mitigating compliance and cyber-security risks, uncovering new revenue opportunities, and creating a more compelling, more differentiated customer or partner experience.

Build a business strategy that exploits the power of data and analytics to respond to changes in market demands, customer expectations, competitive moves, commodity prices, student debt, stagnating salaries, the underemployed, political trends, the decline of the manufacturing middle class, fashion trends, the Chicago Cubs winning the World Series after 108 years… Yes, many of these changes are sudden and unpredictable.

So how does one build this business strategy that exploits the power of big data?

Here is our 5-step approach:

Step 1: Start with the Business Initiatives

How can you transform the business if you don’t understand what’s important to the business? How can you transform the business if you don’t intimately understand what the business is trying to accomplish, why, and the desired business outcomes? Understanding the organization’s key business initiatives is the key to identifying the supporting decisions (use cases), analytics, data, and underlying big data architecture and technology requirements.

Invest the time upfront to envision how the growing bounty of internal and external data coupled with advanced analytics might impact the organization’s most important business initiatives. Brainstorm with the key business stakeholders the decisions that they are trying to make and envision how predictive analytics, prescriptive analytics and ultimately cognitive analytics can help the organization to accelerate, optimize and continuously learn from those decisions.

Step 1 requires intimate engagement between the Business and IT stakeholders. This is not something that IT does alone and then “presents” the results to the business stakeholders in some monthly “alignment” meeting. If the business stakeholders are not leading this effort, then the effort is doomed. Welcome to the 98% who just don’t get it.

Step 2: Identify and Validate Supporting Use Cases

Step 2 involves taking the decisions captured in Step 1 and clustering or grouping the decisions around common subject areas. These clusters become the key business “use cases” that support the organization’s key business initiative (see Figure 2).

Figure 2: Capture and Validate Top Priority Use Cases



The use case documentation should capture both the sources of business value and the potential implementation risks (“eyes wide open”). Tie each use case back to the organization’s key financial goals and assess its impact on each financial goal. Estimate the financial impact and Return on Investment (ROI) if the use case is successfully executed over the next 12 months. Focus on the “4 M’s of Big Data”: Make Me More Money!
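The use case documentation above can be sketched as a simple data structure. This is only an illustration, not a prescribed template: the field names, the churn example, and the dollar figures are all hypothetical, and the ROI formula is the basic (impact − cost) / cost.

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    """One candidate use case captured in Step 2 (illustrative fields)."""
    name: str
    value_drivers: list = field(default_factory=list)  # sources of business value
    risks: list = field(default_factory=list)          # implementation risks ("eyes wide open")
    est_impact_12mo: float = 0.0                       # estimated financial impact over 12 months
    est_cost: float = 0.0                              # estimated cost to execute

    def roi(self) -> float:
        """Simple ROI: (impact - cost) / cost."""
        return (self.est_impact_12mo - self.est_cost) / self.est_cost

# Hypothetical example use case with made-up figures
churn = UseCase(
    name="Reduce customer churn",
    value_drivers=["retained subscription revenue"],
    risks=["incomplete customer contact history"],
    est_impact_12mo=2_400_000,
    est_cost=600_000,
)
print(f"{churn.name}: ROI = {churn.roi():.0%}")  # → Reduce customer churn: ROI = 300%
```

Capturing risks alongside value in the same record keeps the “eyes wide open” assessment attached to the business case rather than buried in a separate document.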

Step 3: Prioritize Use Cases

Step 3 may be the most difficult step because it requires organizations to do two things that they don’t like to do: prioritize and focus. Prioritizing and focusing are not popular concepts for many organizations because of political and organizational pressures. But “peanut buttering” key resources and organizational commitments across a multitude of use cases is the best way to guarantee that no use case gets successfully executed.

If (and that’s a big IF) you can convince the organization to build out their big data business strategy one use case at a time, then that enables the organization to become expert at harvesting the organization’s data and analytic digital assets (and customer, product, service, operational and market insights) and applying those digital assets to subsequent use cases.

The Prioritization Matrix in Figure 3 is an excellent management tool for driving organizational alignment AND commitment around the organization’s top priority use cases.

Figure 3: Prioritization Matrix



See the blog “Prioritization Matrix: Aligning Business and IT On The Big Data Journey” for more details on how to use the Prioritization Matrix.
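One minimal way to sketch a prioritization matrix in code: score each use case on business value (from the business stakeholders) and implementation feasibility (from IT), then rank by a weighted combination. The use case names, the 1–10 scores, and the 60/40 weighting below are all assumptions for illustration; in practice the scores and weights come out of the facilitated alignment discussion itself.

```python
# Hypothetical use cases scored 1-10 on the matrix's two axes
use_cases = {
    "Reduce customer churn":  {"business_value": 9, "feasibility": 7},
    "Predictive maintenance": {"business_value": 8, "feasibility": 4},
    "Dynamic pricing":        {"business_value": 6, "feasibility": 8},
}

def priority(scores, w_value=0.6, w_feasibility=0.4):
    """Weighted score; weighting business value higher is a judgment call."""
    return w_value * scores["business_value"] + w_feasibility * scores["feasibility"]

# Rank highest-priority first
ranked = sorted(use_cases.items(), key=lambda kv: priority(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{priority(scores):.1f}  {name}")
```

The number the formula spits out matters less than the conversation it forces: a use case that is high-value but low-feasibility (like the predictive maintenance example here) surfaces exactly the resourcing debate the matrix is designed to provoke.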

Step 4: Brainstorm and Prioritize Data Sources

Step 4 focuses on brainstorming and prioritizing the different data sources that support the top priority use cases. Since Data Science is about “identifying the variables and metrics that might be better predictors of business or operational performance”, it is important to have a process (we call it “Thinking Like A Data Scientist”) where the business stakeholders can collaborate with the data science team to identify and test different data sources to identify those that might yield the best predictive models (see Figure 4).

Figure 4: Mapping Data Sources to Use Cases



See the following blogs for more details on the data envisioning and prioritization processes:
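The data-source-to-use-case mapping in Figure 4 can be sketched as a simple coverage count: list the sources each top-priority use case needs, then see which sources support the most use cases. All source and use case names below are hypothetical.

```python
from collections import Counter

# Hypothetical mapping: which data sources each top-priority use case needs
use_case_sources = {
    "Reduce customer churn":  ["CRM history", "support tickets", "web clickstream"],
    "Dynamic pricing":        ["transactions", "competitor prices", "web clickstream"],
    "Predictive maintenance": ["sensor telemetry", "maintenance logs"],
}

# Count how many use cases each data source supports
coverage = Counter(src for sources in use_case_sources.values() for src in sources)
for source, n in coverage.most_common():
    print(f"{source}: supports {n} use case(s)")
```

Sources that support multiple use cases (web clickstream in this sketch) are strong candidates to acquire and instrument first, since one investment feeds several predictive models.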

Step 5: Determine Economic Value of Your Data

Step 5 focuses on linking the financial value of the use cases to the data sources (variables and metrics) that support the predictive capabilities necessary to successfully execute each use case. That is, the financial value of a use case is allocated or attributed to its supporting data sources (see Figure 5).

Figure 5: Determining the Economic Value of Your Data



See the recently published research paper with the University of San Francisco titled “Applying Economic Concepts To Big Data To Determine The Financial Value Of The Organization’s Data And Analytics Research Paper” for details on how to determine the economic value of your organization’s data.
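The allocation in Step 5 can be sketched as spreading each use case’s estimated financial value across its supporting data sources and summing per source. The equal split, the use case names, and the dollar figures below are all assumptions for illustration; a real analysis would weight the split by each source’s predictive contribution.

```python
from collections import defaultdict

# Hypothetical 12-month financial value per use case
use_case_value = {
    "Reduce customer churn": 2_400_000,
    "Dynamic pricing":       1_200_000,
}
# Hypothetical supporting data sources per use case
use_case_sources = {
    "Reduce customer churn": ["CRM history", "web clickstream"],
    "Dynamic pricing":       ["transactions", "web clickstream"],
}

# Allocate each use case's value equally across its supporting sources
source_value = defaultdict(float)
for uc, value in use_case_value.items():
    share = value / len(use_case_sources[uc])
    for src in use_case_sources[uc]:
        source_value[src] += share

for src, v in sorted(source_value.items(), key=lambda kv: -kv[1]):
    print(f"{src}: ${v:,.0f}")
```

A source that feeds several valuable use cases accumulates value from each (web clickstream here), which is exactly the “economic value of your data” argument: the data asset’s worth is derived from the business outcomes it enables, not from its storage cost.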


“Used well, big data changes the basis of competition in industry after industry.”

“Used well” means leveraging advanced analytics to uncover insights about your customers, products, services, operations and markets that the organization can use to optimize key operational processes, reduce compliance and security risks, uncover new revenue opportunities and create a more compelling, differentiated customer experience.

Remember: organizations do not need a big data strategy; they need a business strategy that incorporates big data.


The post 5 Steps to Building a Big Data Business Strategy appeared first on InFocus Blog | Dell EMC Services.

More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business”, is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power an organization’s key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.
