
Putting Things to Work in the "Internet of Things"

The Internet is no longer just a network of people using computers and smart devices to communicate with each other

Connected cars, factory equipment and household products that communicate over the Internet are increasingly becoming a reality – one that might soon elicit headlines like “Is the Internet of Things a big bust?”

That’s because it’s one thing to connect a device to the Internet and direct data back to the manufacturer or service provider. It’s another to derive new information from those data streams. The ability to analyze data in the IoT is critical to designing better products, predicting maintenance issues and even improving quality of life.

Understanding the Internet of Things
The Internet is no longer just a network of people using computers and smart devices to communicate with each other. In the not too distant future, everything from the factory floor to a city street will be connected to the Internet. Three out of four global business leaders are exploring the economic opportunities created by the Internet of Things (IoT), according to a report from the Economist.[1]

This connectivity has the potential to allow enterprises to create groundbreaking new products and services. An early warning that a piece of equipment is failing faster than expected allows a manufacturer to redesign the equipment and, if needed, get a jump start on recalling defective products. That early warning could help the company avoid many warranty claims, a larger recall and bad press.

A sprinkler system maker could leap ahead of the competition with a system programmed to sense soil dampness and compare it with current weather forecasts to decide whether to turn a sprinkler on. A savvy entrepreneur could market garbage cans that alert municipal staff when the can is nearly full. That information could be used to re-route trucks in the short term and, over time, to optimize sanitation employee schedules.
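To make the sprinkler idea concrete, here is a minimal sketch of that kind of decision rule in Python; the moisture scale, thresholds and parameter names are illustrative assumptions, not any vendor's actual logic:

```python
# Hypothetical sketch: decide whether to run a sprinkler zone based on
# a soil-moisture reading and the probability of rain in the forecast.
# Thresholds and parameter names are illustrative assumptions.

DRY_THRESHOLD = 0.25       # volumetric soil moisture below this is "dry"
RAIN_PROB_CUTOFF = 0.60    # skip watering if rain is this likely

def should_water(soil_moisture: float, rain_probability: float) -> bool:
    """Return True when the soil is dry and rain is unlikely."""
    return soil_moisture < DRY_THRESHOLD and rain_probability < RAIN_PROB_CUTOFF

# Example: a dry reading with a 20% chance of rain triggers watering.
print(should_water(soil_moisture=0.18, rain_probability=0.20))  # True
print(should_water(soil_moisture=0.18, rain_probability=0.75))  # False
```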

Whatever the innovation, adding intelligence to the IoT requires advanced analytics. The ability to process and analyze data in real time – as it streams from assets and across the network – is the key to taking advantage of the IoT.

Managing the Complexity of Data Streams
An interesting example of IoT potential is already in the early stages of adoption by the auto industry. McKinsey research suggests[2] IoT technologies could save insurers and car owners $100 billion annually in accident reductions, using embedded systems that detect imminent collisions and then take evasive action. When you break apart what it would take to implement such game-changing technology, the role of advanced analytics becomes clear. The data must be understood in real time, yet radar, laser and other sensor readings alone aren’t enough to make an intelligent decision for the driver in that split second. The system needs to know what is about to happen before it actually does. To do that, it needs models that evaluate the near-future scenario, rules that define when the model scores are relevant, and prescribed actions based on well-understood patterns and historical scenario analysis. Those models have to live in the data streams themselves, assessing real-time conditions so they can guide the car away from a pending accident.
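As a rough illustration of how a model score and decision rules might combine inside such a stream, consider the sketch below. The sensor fields, the stand-in risk model and the rule thresholds are assumptions made for illustration only, not the auto industry's actual algorithms:

```python
# Illustrative sketch only: one streaming decision step for collision avoidance.
# The model, rule thresholds and sensor fields are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    range_m: float           # distance to the object ahead (radar/lidar), meters
    closing_speed_ms: float  # positive when the gap is shrinking, m/s

def collision_risk_score(frame: SensorFrame) -> float:
    """A stand-in 'model': risk rises as time-to-collision shrinks."""
    if frame.closing_speed_ms <= 0:
        return 0.0
    time_to_collision = frame.range_m / frame.closing_speed_ms
    return min(1.0, 2.0 / max(time_to_collision, 0.1))

def decide(frame: SensorFrame) -> str:
    """Rules layered on the model score form the decision points."""
    score = collision_risk_score(frame)
    if score > 0.9:
        return "brake"        # imminent collision: take evasive action
    if score > 0.6:
        return "warn_driver"
    return "monitor"

print(decide(SensorFrame(range_m=8.0, closing_speed_ms=12.0)))  # brake
```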

Internet-connected sensors that are being embedded in everything from roadways to refrigerators will transmit so much information that it will be meaningless without robust analytics. Consider these examples:

Sensor and smart meter programs can reduce energy consumption, but only if energy companies have sophisticated forecasting solutions that use the data to quickly reduce expensive last-minute power grid purchases.

Remote patient monitoring can provide convenient access to health care, raise its quality and save money. But if researchers don’t use the data to immediately understand the problem detected by enhanced sensors, added monitoring will simply drive up health costs with no added benefits.

Machine monitoring sensors can diagnose equipment issues and predict asset failure prior to service disruption. When connected to inventory systems, parts can be automatically ordered and field repair team schedules optimized across large regions. This only happens, however, if analytics are embedded throughout the process: recognizing an issue trend as it occurs, identifying the rate of asset lifetime depletion, specifying what’s needed from stock and, of course, calculating the human resource needs.
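Taking the machine-monitoring example, a minimal sketch of how embedded analytics might recognize an issue trend in the stream and trigger a parts order could look like the following; the sensor, the thresholds and the order_part() hook are hypothetical stand-ins:

```python
# A minimal sketch of embedded analytics in the machine-monitoring example.
# The sensor, thresholds and the order_part() hook are hypothetical.
from collections import deque
from statistics import mean

WINDOW = 20             # number of recent readings to trend over
VIBRATION_LIMIT = 4.5   # mm/s RMS, illustrative failure indicator

recent = deque(maxlen=WINDOW)

def order_part(part_id: str) -> None:
    """Placeholder for an inventory-system integration."""
    print(f"ordering replacement part {part_id}")

def on_reading(vibration_mm_s: float) -> None:
    """Called for each streamed reading; flags a failure trend early."""
    recent.append(vibration_mm_s)
    if len(recent) == WINDOW and mean(recent) > VIBRATION_LIMIT:
        order_part("bearing-7A")   # hypothetical part identifier
        recent.clear()             # avoid duplicate orders for the same trend

# Simulate a stream drifting toward failure.
for i in range(60):
    on_reading(3.0 + i * 0.05)
```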

Analysis in the Internet of Things
Some of the common analytic techniques used today aren’t fast enough to work with IoT data streams. In traditional analysis, data is first stored in a repository – tables, files and the like – and then analyzed. With streaming data, however, the algorithms and decision logic are stored and the data passes through them for analysis. This type of analysis makes it possible to identify and examine patterns of interest as the data is being created – in real time.

Instead of “stream it, score it and store it,” your organization needs to be able to stream it, score it and then decide whether it needs to be stored.
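One way to picture that pattern is the sketch below: the scoring logic is resident, each event is evaluated as it arrives and only the interesting events are kept. The event source, the scoring rule and the storage layer are illustrative assumptions:

```python
# Sketch of the "stream it, score it, then decide whether to store it" pattern.
# The event source, scoring rule and storage layer are illustrative assumptions.
import random

def event_stream(n: int = 1000):
    """Stand-in for data arriving from connected devices."""
    for _ in range(n):
        yield {"device": "sensor-01", "temperature_c": random.gauss(70.0, 5.0)}

def score(event: dict) -> float:
    """Resident decision logic: the data passes through stored algorithms."""
    return abs(event["temperature_c"] - 70.0) / 5.0   # rough anomaly score

stored = []   # stand-in for a repository; only interesting events land here
for event in event_stream():
    s = score(event)
    if s > 2.0:                        # keep roughly the 5% most unusual events
        stored.append({**event, "score": round(s, 2)})

print(f"kept {len(stored)} of 1000 events for further analysis")
```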

With advanced analytic techniques, data streaming moves beyond monitoring existing conditions to evaluating future scenarios and examining complex questions – continuously. And because information that is current to within a fraction of a second is at your fingertips, you consistently know what could happen next, letting you tweak tactical activities and enrich decision strategies.

To achieve predictive abilities using IoT data, routines and algorithms are coded into software that reads the streaming data at the device level or, say, in a repository (typically cloud-based). Data normalization and business rules are also included in the programming, cleansing the streaming data and defining the threshold conditions associated with patterns of interest for current and future scenarios. Beyond monitoring conditions and thresholds, you can build smart filters into the IoT data streams to decide what should be kept for further analysis – to assess likely future events and plan for countless what-if scenarios – and what to archive versus throw away.
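As a hedged sketch of how normalization, threshold rules and smart filtering could fit together in such a stream, consider the following; the field names, units and cutoffs are assumptions made for illustration:

```python
# A hedged sketch of normalization, threshold rules and smart filtering.
# Field names, units and cutoffs are assumptions for illustration.

def normalize(raw: dict):
    """Cleanse a raw reading: drop malformed records, standardize units.

    Returns a cleaned dict, or None if the record can't be parsed.
    """
    try:
        value = float(raw["value"])
    except (KeyError, TypeError, ValueError):
        return None                      # discard records that can't be parsed
    if raw.get("unit") == "F":           # convert Fahrenheit to Celsius
        value = (value - 32.0) * 5.0 / 9.0
    return {"device": raw.get("device", "unknown"), "temperature_c": value}

def route(reading: dict) -> str:
    """Smart filter: keep, archive or discard based on threshold rules."""
    t = reading["temperature_c"]
    if t > 85.0:
        return "keep"      # pattern of interest: analyze immediately
    if t > 75.0:
        return "archive"   # useful later for what-if scenario analysis
    return "discard"       # routine reading, no need to store

raw_events = [
    {"device": "boiler-3", "value": "190", "unit": "F"},   # ~87.8 C -> keep
    {"device": "boiler-3", "value": "78.2"},               # archive
    {"device": "boiler-3", "value": "bad-data"},           # dropped in cleansing
]
for raw in raw_events:
    cleaned = normalize(raw)
    print(route(cleaned) if cleaned else "rejected")
```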

Advanced, high-performance analytics that can work with streaming data are critical to realizing the potential of the Internet of Things. Without them, you’ll soon see “Internet of No Thing” headlines on your favorite website.

References

  1. The Internet of Things Business Index, The Economist
  2. The Internet of Things, McKinsey Quarterly

More Stories By Fiona McNeill

Fiona McNeill is the Global Product Marketing Manager at SAS. With a background in applying analytics to real-world business scenarios, she focuses on the automation of analytic insight in both business and application processing. Having been at SAS for over 15 years, she has worked with organizations across a variety of industries, understanding their business and helping them derive tangible benefit from their strategic use of technology. She is coauthor of the book Heuristics in Analytics: A Practical Perspective of What Influences Our Analytical World.
