By Fiona McNeill
August 21, 2014 07:00 PM EDT
Connected cars, factory equipment and household products communicating over the Internet is increasingly becoming a reality – one that might soon elicit headlines like “Is the Internet of Things a big bust?”
That’s because it’s one thing to connect a device to the Internet and direct data back to the manufacturer or service provider. It’s another to derive new information from those data streams. The ability to analyze IoT data is critical to designing better products, predicting maintenance issues, and even improving quality of life.
Understanding the Internet of Things
The Internet is no longer just a network of people using computers and smart devices to communicate with each other. In the not too distant future, everything from the factory floor to a city street will be connected to the Internet. Three out of four global business leaders are exploring the economic opportunities created by the Internet of Things (IoT), according to a report from the Economist.
This connectivity has the potential to allow enterprises to create groundbreaking new products and services. An early warning that a piece of equipment is failing faster than expected allows a manufacturer to redesign the equipment and, if needed, get a jumpstart on recalling defective products. This could eliminate many warranty claims, a larger recall and bad press.
A sprinkler system maker could leap ahead of the competition with a system programmed to sense soil dampness and compare that with current weather forecasts to decide whether to turn a sprinkler on. A savvy entrepreneur could market garbage cans that alert municipal staff when the can is nearly full. That information could be used to re-route trucks in the short-term and over time, to optimize sanitation employee schedules.
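The sprinkler idea above boils down to a simple decision rule that combines a local sensor reading with an external forecast. A minimal sketch, where the function name, sensor units and thresholds are all illustrative assumptions rather than anything from a real product:

```python
def should_water(soil_moisture_pct: float, rain_prob: float) -> bool:
    """Decide whether to run the sprinkler.

    soil_moisture_pct: current soil dampness reading, 0-100 (hypothetical sensor).
    rain_prob: forecast probability of rain in the next 24 hours, 0.0-1.0.
    The thresholds below are illustrative, not taken from the article.
    """
    DRY_THRESHOLD = 30.0   # below this, the soil is considered dry
    RAIN_SKIP = 0.6        # skip watering if rain is likely anyway
    return soil_moisture_pct < DRY_THRESHOLD and rain_prob < RAIN_SKIP

# Dry soil, low chance of rain: water now
print(should_water(22.0, 0.1))  # True
# Dry soil, but rain is expected: hold off and let the weather do the work
print(should_water(22.0, 0.8))  # False
```

The same shape of rule (sensor reading plus external context) underlies the garbage-can example: fill level plus truck location decides whether a pickup is scheduled.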
Whatever the innovation, adding intelligence to the IoT requires advanced analytics. The ability to process and analyze data in real time – as it streams from assets and across the network – is the key to taking advantage of the IoT.
Managing the Complexity of Data Streams
An interesting example of IoT potential is already in the early stages of adoption by the auto industry. McKinsey Research suggests IoT technologies could save insurers and car owners $100 billion annually in accident reductions, using embedded systems that detect imminent collisions and then take evasive action. Break apart what it would take to implement such game-changing technology, and the role of advanced analytics becomes clear. The data must be understood in real time, but raw radar, laser and other sensor readings alone aren’t enough to make an intelligent decision for the driver in that split second. The system needs to know what is about to happen before it actually does. To do that, it needs models that evaluate the near-future scenario, rules that define when those model scores are relevant, and prescribed actions based on well-understood patterns and historic scenario analysis. Those models must live in the data streams themselves, assessing real-time conditions so they can guide the car away from a pending accident.
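To make the pattern concrete (a model scores each sensor frame in-stream, and a rule layer turns scores into actions), here is a minimal sketch. The feature names, the time-to-contact heuristic standing in for a trained model, and the score thresholds are all illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    closing_speed_mps: float   # from radar (hypothetical feature)
    distance_m: float          # from laser ranging (hypothetical feature)
    brake_applied: bool

def collision_score(f: SensorFrame) -> float:
    """Score the near-future collision risk on [0, 1].

    Stand-in for a trained model; the weights here are illustrative only.
    """
    if f.distance_m <= 0:
        return 1.0
    # time-to-contact heuristic: fast closing over a short gap is riskier
    ttc = f.distance_m / max(f.closing_speed_mps, 0.01)
    scale = 0.75 if f.brake_applied else 1.5
    return min(1.0, scale / ttc)

def decide(f: SensorFrame) -> str:
    """Rule layer: define when model scores are relevant and prescribe actions."""
    s = collision_score(f)
    if s >= 0.9:
        return "evasive_steer"
    if s >= 0.6:
        return "auto_brake"
    return "monitor"

frame = SensorFrame(closing_speed_mps=25.0, distance_m=12.0, brake_applied=False)
print(decide(frame))  # evasive_steer
```

The point is the split of responsibilities the article describes: the model produces a score, and separately stored rules decide when that score warrants action.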
Internet-connected sensors that are being embedded in everything from roadways to refrigerators will transmit so much information that it will be meaningless without robust analytics. Consider these examples:
Sensor and smart-meter programs can reduce energy consumption, but only if energy companies have sophisticated forecasting solutions that use the data to quickly reduce expensive last-minute power-grid purchases.
Remote patient monitoring can provide convenient access to health care, raise its quality and save money. But if researchers don’t use the data to immediately understand the problem detected by enhanced sensors, added monitoring will simply drive up health costs with no added benefits.
Machine-monitoring sensors can diagnose equipment issues and predict asset failure before service is disrupted. When connected to inventory systems, parts can be ordered automatically and field repair schedules optimized across large regions. This only happens, however, if analytics are embedded throughout the process: recognizing an issue trend as it occurs, identifying the rate of asset lifetime depletion, specifying what’s needed from stock and, of course, calculating the human resource needs.
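The machine-monitoring chain (detect a trend, estimate remaining life, trigger the parts order) can be sketched as a simple pipeline. The linear wear model, the reading units and the thresholds are illustrative assumptions; real degradation models are considerably more sophisticated:

```python
def remaining_life_hours(vibration_history: list[float],
                         failure_level: float = 10.0,
                         interval_hours: float = 1.0) -> float:
    """Extrapolate a linear wear trend to estimate hours until failure.

    vibration_history: recent vibration readings (mm/s), one per interval.
    failure_level: reading at which the asset is assumed to fail.
    A linear fit is a deliberate simplification for illustration.
    """
    n = len(vibration_history)
    if n < 2:
        return float("inf")
    # least-squares slope of reading vs. sample index
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(vibration_history) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, vibration_history)) \
        / sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return float("inf")  # no upward trend: nothing to act on
    latest = vibration_history[-1]
    return max(0.0, (failure_level - latest) / slope * interval_hours)

readings = [4.0, 4.5, 5.1, 5.6, 6.0]  # steadily rising vibration
hours = remaining_life_hours(readings)
if hours < 48:
    # downstream steps the article describes: order stock, schedule the crew
    print(f"~{hours:.0f}h left: order replacement part, schedule field repair")
```

Each downstream action (inventory check, crew scheduling) would hang off the same estimate, which is why the analytics have to sit inside the stream rather than in a monthly report.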
Analysis in the Internet of Things
Some of the common analytic techniques used today aren’t fast enough to work with IoT data streams. In traditional analysis, data is first stored, in a repository, tables and so on, and then analyzed. With streaming data the relationship is inverted: the algorithms and decision logic are stored, and the data passes through them for analysis. This makes it possible to identify and examine patterns of interest as the data is being created, in real time.
Instead of stream it, score it and store it, your organization needs to be able to stream it, score it, and then decide whether to store it.
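The stream-score-decide pattern can be sketched in a few lines: the scoring logic is stored, events flow through it, and only events worth keeping reach the (hypothetical) downstream store. The scoring function, field names and threshold are illustrative assumptions:

```python
from typing import Iterable, Iterator

def score(event: dict) -> float:
    """Stand-in for a stored model; here, just a field lookup with a default."""
    return float(event.get("anomaly", 0.0))

def stream_score_store(events: Iterable[dict],
                       store_threshold: float = 0.5) -> Iterator[dict]:
    """The data passes through the stored logic, not the other way around.

    Every event is scored in-flight; only those that clear the threshold
    are yielded onward for persistence and deeper analysis.
    """
    for event in events:
        event["score"] = score(event)
        if event["score"] >= store_threshold:
            yield event  # keep: pattern of interest
        # low-score events are acted on if needed, then discarded

incoming = [{"id": 1, "anomaly": 0.1}, {"id": 2, "anomaly": 0.9}]
stored = list(stream_score_store(incoming))
print([e["id"] for e in stored])  # [2]
```

Using a generator keeps the logic one-pass and memory-light, which mirrors the article's point: the decision happens while the data is in motion, not after it lands in a repository.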
With advanced analytic techniques, data streaming moves beyond monitoring existing conditions to evaluating future scenarios and examining complex questions, continuously. And because information current to within a fraction of a second is at your fingertips, you consistently know what could happen next, tweaking tactical activities and enriching decision strategies.
To achieve predictive abilities with IoT data, routines and algorithms are coded into software that reads the stream data at the device level or in a repository (typically cloud-based). Data normalization and business rules are included in the programming as well, cleansing the stream data and defining the threshold conditions associated with patterns of interest for current and future scenarios. Beyond monitoring conditions and thresholds, you can build smart filters into IoT data streams to decide what should be kept for further analysis of likely future events and countless what-if scenarios, and what to archive versus throw away.
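Those in-stream steps (normalize the raw reading, apply business rules and thresholds, then smart-filter what to act on, archive, or discard) might look like the sketch below. Field names, units and thresholds are illustrative assumptions:

```python
def normalize(reading: dict) -> dict:
    """Cleanse one raw reading: unify units before any rule is applied."""
    r = dict(reading)
    if r.get("unit") == "F":  # convert Fahrenheit to Celsius
        r["temp_c"] = (r.pop("temp") - 32) * 5 / 9
        r["unit"] = "C"
    else:
        r["temp_c"] = r.pop("temp", None)
    return r

def route(reading: dict,
          alert_above_c: float = 90.0,
          keep_above_c: float = 60.0) -> str:
    """Threshold rules double as a smart filter: alert, archive, or discard."""
    t = reading["temp_c"]
    if t is None:
        return "discard"       # unreadable even after cleansing
    if t >= alert_above_c:
        return "alert"         # pattern of interest: act now
    if t >= keep_above_c:
        return "archive"       # keep for what-if analysis later
    return "discard"           # routine reading, not worth storing

r = normalize({"sensor": "s1", "temp": 203.0, "unit": "F"})
print(route(r))  # 203F is 95C, so this routes to "alert"
```

The filter is what keeps the volume manageable: most readings never need to be stored, only the ones the rules flag as interesting now or potentially interesting later.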
Advanced, high-performance analytics that can work with streaming data are critical to realizing the potential of the Internet of Things. Without them, you’ll soon see “Internet of No Thing” headlines on your favorite website.