By Fiona McNeill
August 21, 2014 07:00 PM EDT
Connected cars, factory equipment and household products communicating over the Internet are increasingly becoming a reality – one that might soon elicit headlines like “Is the Internet of Things a big bust?”
That’s because it’s one thing to connect a device to the Internet and direct data back to the manufacturer or service provider. It’s another to derive new information from those data streams. The ability to analyze data in the IoT is critical to designing better products, predicting maintenance issues, and even improving quality of life.
Understanding the Internet of Things
The Internet is no longer just a network of people using computers and smart devices to communicate with each other. In the not too distant future, everything from the factory floor to a city street will be connected to the Internet. Three out of four global business leaders are exploring the economic opportunities created by the Internet of Things (IoT), according to a report from the Economist.
This connectivity has the potential to allow enterprises to create groundbreaking new products and services. An early warning that a piece of equipment is failing faster than expected allows a manufacturer to redesign the equipment and, if needed, get a jumpstart on recalling defective products. That could head off many warranty claims, a larger recall and bad press.
A sprinkler system maker could leap ahead of the competition with a system programmed to sense soil dampness and compare that with current weather forecasts to decide whether to turn a sprinkler on. A savvy entrepreneur could market garbage cans that alert municipal staff when the can is nearly full. That information could be used to re-route trucks in the short term and, over time, to optimize sanitation employee schedules.
Whatever the innovation, adding intelligence to the IoT requires advanced analytics. The ability to process and analyze data in real time – as it streams from assets and across the network – is the key to taking advantage of the IoT.
Managing the Complexity of Data Streams
An interesting example of IoT potential is already in the early stages of adoption by the auto industry. McKinsey Research suggests IoT technologies could save insurers and car owners $100 billion annually in accident reductions, using embedded systems that detect imminent collisions – and then take evasive action. When you break apart what it would take to implement such game-changing technology, the role of advanced analytics becomes clear. The data must be understood in real time, but radar, laser and other sensor data alone isn’t enough to make an intelligent decision for the driver in that split second. The system needs to know what is about to happen before it actually does. To do that, it needs models that evaluate the near-future scenario, rules that form the decision points for when the model scores are relevant, and prescribed actions based on well-understood patterns and historic scenario analysis. All of that data must be distilled into models that live in the streams, assessing real-time conditions so they can guide the car away from a pending accident.
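The model-plus-rules pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration – the scoring function, field names and thresholds are invented stand-ins for a trained model and production decision rules, not any real collision-avoidance system.

```python
def collision_risk(reading):
    """Toy stand-in for a trained model: score the near-future scenario.

    A real system would score a model trained on historic scenario data;
    here, risk rises as the time to impact shrinks.
    """
    closing_speed = reading["own_speed"] - reading["obstacle_speed"]  # m/s
    if closing_speed <= 0:
        return 0.0  # not closing on the obstacle
    time_to_impact = reading["gap_m"] / closing_speed  # seconds
    return min(1.0, 1.5 / time_to_impact)

def decide(reading, brake_threshold=0.8, warn_threshold=0.5):
    """Rules define when the model score is relevant and prescribe an action."""
    score = collision_risk(reading)
    if score >= brake_threshold:
        return "brake"
    if score >= warn_threshold:
        return "warn"
    return "none"

# A fast-closing gap of 20 m triggers evasive action.
print(decide({"own_speed": 30.0, "obstacle_speed": 5.0, "gap_m": 20.0}))
```

The point of the sketch is the separation of concerns the article describes: the model evaluates what is about to happen, while the rules decide when that score warrants intervening.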
Internet-connected sensors that are being embedded in everything from roadways to refrigerators will transmit so much information that it will be meaningless without robust analytics. Consider these examples:
Sensor and smart meter programs can reduce energy consumption, but only if energy companies have sophisticated forecasting solutions that use the data to quickly reduce expensive last-minute power grid purchases.
Remote patient monitoring can provide convenient access to health care, raise its quality and save money. But if researchers don’t use the data to immediately understand the problem detected by enhanced sensors, added monitoring will simply drive up health costs with no added benefits.
Machine monitoring sensors can diagnose equipment issues and predict asset failure prior to service disruption. When connected to inventory systems, parts can be ordered automatically and field repair team schedules optimized across large regions. This only happens, however, if analytics are embedded throughout the process – recognizing an issue trend as it occurs, identifying the rate of asset lifetime depletion, specifying what’s needed from stock and, of course, calculating the human resource needs.
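The monitoring-to-maintenance chain in the last example hinges on estimating how fast an asset is wearing out. Here is a hedged sketch of that one step: fit a simple least-squares trend to toy wear readings and project the remaining useful life, which downstream inventory and scheduling systems could then act on. The function name, wear scale and limit are illustrative assumptions.

```python
def remaining_life(hours, wear, limit=1.0):
    """Least-squares slope of wear vs. operating hours, then hours until `limit`."""
    n = len(hours)
    mean_h = sum(hours) / n
    mean_w = sum(wear) / n
    slope = sum((h - mean_h) * (w - mean_w) for h, w in zip(hours, wear)) / \
            sum((h - mean_h) ** 2 for h in hours)
    # Project forward at the current depletion rate.
    return (limit - wear[-1]) / slope

hours = [0, 100, 200, 300]
wear = [0.0, 0.2, 0.4, 0.6]   # toy degradation readings, 1.0 = end of life
print(remaining_life(hours, wear))  # ≈ 200 hours left at this wear rate
```

In practice the depletion model would be far richer, but the output is the same kind of number: a lead time that lets parts be stocked and crews scheduled before the failure occurs.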
Analysis in the Internet of Things
Some of the common analytic techniques used today aren’t fast enough to work with IoT data streams. In traditional analysis, data is stored in a repository, tables, etc., and then analyzed. With streaming data, however, the algorithms and decision logic are stored and the data passes through them for analysis. This type of analysis makes it possible to identify and examine patterns of interest as the data is being created – in real time.
Instead of stream it, score it and store it, your organization needs to be able to stream it, score it and then decide whether to store it.
With advanced analytic techniques, data streaming moves beyond monitoring existing conditions to evaluating future scenarios and examining complex questions – continuously. And because you have sub-second information at your fingertips, you consistently know what could happen next, tweaking tactical activities and enriching decision strategies.
To achieve predictive abilities using IoT data, routines and algorithms are coded into software that reads the stream data at the device level or in a repository (typically cloud-based). Data normalization and business rules are also included in the programming, cleansing the stream data and defining the threshold conditions associated with patterns of interest for current and future scenarios. In addition to monitoring conditions and thresholds, you can build smart filters into IoT data streams to decide what should be kept for further analysis – assessing likely future events and planning for countless what-if scenarios – and even what to archive versus what to throw away.
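A smart filter of the kind just described combines those pieces: normalize the raw reading, then apply threshold rules to route it. The sensor names, units and cutoffs below are assumptions made for illustration.

```python
def normalize(raw):
    """Cleanse the stream: unify units (here, Fahrenheit readings to Celsius)."""
    temp_c = (raw["temp_f"] - 32) * 5.0 / 9.0 if "temp_f" in raw else raw["temp_c"]
    return {"sensor": raw["sensor"], "temp_c": round(temp_c, 1)}

def route(reading, alert_c=80.0, archive_c=60.0):
    """Business rules define the threshold conditions of interest."""
    if reading["temp_c"] >= alert_c:
        return "analyze"   # a pattern of interest: keep for what-if analysis
    if reading["temp_c"] >= archive_c:
        return "archive"   # worth retaining, not urgent
    return "discard"       # routine reading; throw away

reading = normalize({"sensor": "pump-7", "temp_f": 176.0})
print(reading, route(reading))
```

The same filter that flags the alert also answers the archive-versus-discard question, so storage decisions are made in the stream rather than after the fact.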
Advanced and high-performance analytics that can work with streaming data are critical to realizing the potential of the Internet of Things. Without them, you’ll soon see “Internet of No Thing” headlines on your favorite website.