
Demystifying #DataScience | @CloudExpo #BigData #AI #ArtificialIntelligence

Data science is about identifying those variables and metrics that might be better predictors of performance

[Opening Scene]: Billy Dean is pacing the office. He’s struggling to keep his delivery trucks at full capacity and on the road. Random breakdowns, unexpected employee absences, and unscheduled truck maintenance are impacting bookings, revenues and ultimately customer satisfaction. He keeps hearing from his business customers how they are leveraging data science to improve their business operations. Billy Dean starts to wonder if data science can help him. As he contemplates what data science can do for him, he slowly drifts off to sleep, and visions of Data Science start dancing in his head…

[Poof! Suddenly Wizard Wei appears]: Hi, I’m your data science wizard, here to help alleviate your data science concerns. I don’t understand why folks try to make the data science discussion complicated. Let’s start with a simple definition of data science:

Data science is about identifying those variables and metrics that might be better predictors of performance

The key to a successful analytical model is having a robust set of variables to test for their predictive capabilities. And the key to assembling that robust set of variables is to get the business users engaged early in the process.

[A confused Billy Dean]: Okay, but I’m still confused. I mean, how does this really apply to my business?

[A patient Wizard Wei]: Well, let’s say that you are trying to predict which of your routes are likely to have under-capacity loads so that you can combine loads. In order to identify those variables that might be better predictors of under-capacity routes, you might ask your business users:

What data might you want to have in order to predict under-capacity routes?

The business users are likely to come up with a wide variety of variables, including:

Customer name, ship-to location, customer industry
Building permits, customer tenure, change in customer size
Customer stock price, customer D&B rating, types of products hauled
Time of year, seasonality/holidays, day of week
Traffic, weather, local events
Distance from distribution center, open headcount on Indeed.com, tenure of logistics manager

The Data Science team will then gather these variables, perform some data transformations and enrichment, and then look for variables and combinations of variables that yield the best predictive results regarding under-capacity routes (see Figure 1).

Figure 1: Data Science Process
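
To make that variable-testing loop concrete, here is a minimal sketch in Python using scikit-learn. The file name, the column names, and the under_capacity label are all hypothetical stand-ins for route data like Billy Dean’s, not anything from an actual engagement:

```python
# A minimal sketch of the variable-testing loop described above.
# All column names and the CSV file are hypothetical; a real project
# would use the trucking company's own route history.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

routes = pd.read_csv("route_history.csv")  # hypothetical route-level data

# Candidate predictor variables suggested by the business users
candidates = ["customer_tenure", "day_of_week", "distance_from_dc",
              "weather_score", "local_event_flag", "products_hauled_code"]

X = routes[candidates]
y = routes["under_capacity"]  # 1 if the route ran under capacity, else 0

# Score the candidate set with cross-validation, then rank each variable
# by how much the fitted model relies on it.
model = RandomForestClassifier(n_estimators=200, random_state=42)
print("Predictive accuracy:", cross_val_score(model, X, y, cv=5).mean())

model.fit(X, y)
for name, importance in sorted(zip(candidates, model.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name}: {importance:.3f}")
```

Variables that score near zero get dropped; promising ones get enriched and recombined, and the loop repeats.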

Role of Artificial Intelligence
[A less confused Billy Dean]: Ah, I think I understand, but what about all this talk about artificial intelligence? From some of these commercials on TV, it appears that robots with artificial intelligence will be ruling the world. Can you say Skynet?

[A still patient Wizard Wei]: Ah, that’s just marketing. Artificial intelligence is just one of many different tools in the predictive analytics kit bag of a data scientist. But artificial intelligence – while embracing some very sophisticated mathematical, data enrichment and computing techniques – is really pretty straightforward. All artificial intelligence is trying to do is to find and quantify relationships between variables buried in large data sets (see Figure 2).

Figure 2: Understanding Artificial Intelligence
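
As a toy illustration of that idea, the sketch below (hypothetical, using synthetic data and scikit-learn’s MLPClassifier) shows a small neural network finding and quantifying a hidden relationship between two variables:

```python
# A toy illustration of "finding and quantifying relationships between
# variables": a small neural network learns how two synthetic input
# variables relate to an outcome. Purely illustrative, not a real model.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))                  # two synthetic variables
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # the hidden relationship

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0)
net.fit(X, y)
print("Relationship recovered with accuracy:", net.score(X, y))
```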

[An inquisitive Billy Dean]: Okay, I’m starting to get it, but there seem to be so many different analytic and predictive algorithms from which to choose. How does the business user know where to start?

[A slightly frustrated Wizard Wei]: Ah, that’s the secret to the process. Business users don’t need to know which algorithms to use; they need to be able to identify those variables that might be better predictors of performance. It is up to the data science team to determine which variables are most appropriate by testing them against the different algorithms.

Data Mining, Machine Learning and Artificial Intelligence (including areas such as cognitive computing, statistics, neural networks, text analytics, video analytics, etc.) are all members of the broader category of data science tools. Our data science team has experts in each of these areas, though no one data scientist is an expert in all of them (in spite of what they tell me). The different data science tools are used in different scenarios for different needs. Think of one of your mechanics: they have a large toolbox full of different tools, and they decide which tools to use to fix a truck based upon the problem they are trying to solve. That’s exactly what a data scientist is doing, just with a different toolbox of algorithms.

No single algorithm is best across every domain, so different algorithms are needed to cover different problems. Often combinations of algorithms are used to achieve the best results. To be honest, it’s like a giant jigsaw puzzle, with the data science team constantly testing different combinations of metrics, data enrichment and algorithms until they find the combination that yields the best results.
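
Here is a hedged sketch of that “keep testing until something wins” loop, assuming scikit-learn and synthetic data; the three contender models are illustrative choices, not a prescribed set:

```python
# Compare several candidate algorithms on the same (synthetic) data and
# report them ranked by cross-validated score. Models and data are
# illustrative only; a real team would also vary features and enrichment.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=1)

contenders = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(random_state=1),
    "gradient boosting": GradientBoostingClassifier(random_state=1),
}

scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in contenders.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
```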

[An enlightened Billy Dean]: I think I’ve finally got it. All of these different algorithms and techniques are just trying to help predict what is likely to happen so that I can make better operational and customer decisions. But what’s the realm of what’s possible with data and analytics? I mean, how effective can my organization become at leveraging data and analytics to power my business?

[A proud Wizard Wei]: Great question, and the heart of the big data and data science conversation. Figure 3 shows how you could use these different data science tools to progress up the Big Data Business Model Maturity Index: to transition from running your business on Descriptive analytics that tell you what happened (Monitoring stage), to Predictive analytics that tell you what is likely to happen (Insights stage), to Prescriptive analytics that tell you what you should do (Optimization stage).

Figure 3: Leveraging Artificial Intelligence to drive Business Value

In the end, the data and the analytics are only useful if they help you optimize key operational processes, reduce compliance and security risks, uncover new revenue opportunities, and create a more compelling, more prescriptive customer engagement. Ultimately, data and analytics are all about your business.

[A satisfied Billy Dean]: That’s great, Wizard Wei! Thanks for your help!

Now, what can you do about my taxes…

To learn more about “Demystifying Data Science,” come to my Dell EMC World session, “Demystifying Data Science: A Pragmatic Guide to Building Big Data Use Cases.” See you there!


About the Author

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business,” is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power organizations’ key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.
