
Shifting The Storage Paradigm | Part 3: Object Storage Enables IoT

Object storage has emerged as the logical paradigm for storing the IoT data

Previously in this series, I explained the evolution of unstructured data and how storage requirements have changed over the past decade due to pressure from above and below: the massive growth in unstructured, mostly immutable data requires cost-efficient, easy-to-scale storage architectures without the complexity of file systems. I noted that object storage was designed for this purpose, and that beyond its scalability and efficiency benefits it also excels at accessibility: REST and other protocols make it easy for applications to connect to the storage directly and to give users access to their data through all sorts of mobile and web applications.
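To make that accessibility point concrete, here is a minimal sketch of the kind of REST call an application might issue against an S3-style object store. The host name, bucket, and object key are invented for illustration, and a real deployment would add authentication headers on top of this.

```python
# Minimal sketch of REST-style object storage access.
# The endpoint, bucket and key below are hypothetical; real
# deployments add authentication (e.g. request signatures).

def build_put_request(host, bucket, key, body):
    """Compose a raw HTTP PUT for storing an object."""
    return (
        f"PUT /{bucket}/{key} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
    ).encode() + body

def build_get_request(host, bucket, key):
    """Compose the matching HTTP GET to retrieve the object."""
    return (
        f"GET /{bucket}/{key} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "\r\n"
    ).encode()

photo = b"...jpeg bytes..."
put = build_put_request("storage.example.com", "photos", "2014/cat.jpg", photo)
get = build_get_request("storage.example.com", "photos", "2014/cat.jpg")
print(put.decode().splitlines()[0])  # PUT /photos/2014/cat.jpg HTTP/1.1
```

Because the "API" is just HTTP, any mobile or web application that can make a web request can talk to the storage layer directly, with no file system mount in between.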

I also explained how information-sensing devices are not exclusive to scientific analytics environments: think of cameras and smartphones, but also cheap network cameras for home security, thermostats that warm up our houses when we are on our way home, and smart fridges or watering devices that keep our plants healthy and happy even when we are on vacation. This wave of innovation, built on the capability to generate, process and leverage data in apps and devices, is now popularly called the Internet of Things (IoT).

IoT is not a new concept. Wikipedia says: “The term Internet of Things was proposed by Kevin Ashton in 1999 though the concept has been discussed in the literature since at least 1991.” In its early stages, the concept related to the use of radio-frequency identification (RFID) and how “if all objects and people in daily life were equipped with identifiers, they could be managed and inventoried by computers.” Finally, Wikipedia adds that “Equipping all objects in the world with minuscule identifying devices or machine-readable identifiers could transform daily life.” This is exactly what is happening today, and it is what makes IoT more important than ever.

Apple retail stores are full of “gadgets” that can make our lives easier, healthier and more enjoyable. Google has jumped on the gadget bandwagon with Google Glass. And to size this gadget market: Gartner predicts there will be over 25 billion IoT devices by 2020.

So what does this have to do with the topic at hand – Object Storage?

The one thing all those IoT devices have in common is that they log, generate and process data, turning it into information that helps us keep track of our workouts, optimize energy consumption, or take home automation (which sounds so much better in Italian: “domotica”) to the next level.

The fact that all these IoT devices are connected to the Internet means that more and more data will be uploaded from those devices. A lot of it is very small data, but from a volume point of view, if we take the sum of all the data those devices are generating, we are talking about exabytes and exabytes of information in the form of unstructured data – almost too much to fathom!
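A rough back-of-envelope calculation shows how quickly small readings add up. All figures here are illustrative assumptions (a 512-byte reading once per minute from Gartner's projected device count), not measurements:

```python
# Back-of-envelope estimate: every figure is an illustrative assumption.
devices = 25_000_000_000        # Gartner's 2020 device projection
reading_bytes = 512             # one small sensor reading (assumed)
readings_per_day = 24 * 60      # one reading per minute (assumed)

bytes_per_day = devices * reading_bytes * readings_per_day
petabytes_per_day = bytes_per_day / 10**15
exabytes_per_year = bytes_per_day * 365 / 10**18
print(f"{petabytes_per_day:.1f} PB/day, {exabytes_per_year:.1f} EB/year")
# 18.4 PB/day, 6.7 EB/year
```

Even with tiny individual payloads, the aggregate lands in the exabyte range per year, which is the scale the rest of this post is concerned with.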

Traditional storage is simply not capable of handling data at this scale, so object storage has emerged as the logical paradigm for storing it. Designed for large volumes of data, it supports distributed environments, and the applications that run on these devices can integrate directly with the storage layer. Not all object storage platforms qualify for IoT data, however, for several reasons:

  • Small files in particular are a challenge for many object storage platforms: NoFS (No File System) becomes a key requirement.
  • Performance needs to scale in all dimensions, especially IOPS and latency.
  • Different data types and different applications require different data protection mechanisms, as well as flexible architectures.
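A toy model can make the first and third points concrete. The sketch below is purely illustrative and not any vendor's actual design: it shows a flat object namespace with no directory tree, where each object carries its own metadata so the protection mechanism can vary per object:

```python
# Toy flat-namespace object store: purely illustrative, not any
# vendor's actual design. Objects live in one flat key space (no
# file system hierarchy to traverse), and each object carries its
# own metadata, so protection policy can differ per object.

class FlatObjectStore:
    def __init__(self):
        self.objects = {}  # key -> (data, metadata); no hierarchy

    def put(self, key, data, protection="replication:3"):
        """Store an object with its own protection policy."""
        self.objects[key] = (data, {"protection": protection})

    def get(self, key):
        data, _meta = self.objects[key]
        return data

store = FlatObjectStore()
# A tiny sensor reading is cheap to replicate...
store.put("thermostat-42/reading-0001", b'{"temp": 20.5}')
# ...while a large video clip is better erasure-coded.
store.put("camera-7/clip-0001.mp4", b"...", protection="erasure:9+3")
print(sorted(store.objects))
```

The slashes in the keys are just naming convention, not directories: lookups stay O(1) regardless of how many objects share a prefix, which is why small files do not force the metadata overhead a file system would impose.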

IoT has been around for a while, but things are only getting started. Innovation doesn’t stand still, so who knows how data storage requirements will evolve over the next decade?

So ends my three-part series, which covered the evolution of digital data over the past four decades and illustrated how storage platforms have evolved to meet the requirements of new data types and applications. Object storage is designed for long-term storage strategies, but we understand it will probably not be the end point.

More Stories By Tom Leyden

Tom Leyden is VP Product Marketing at Scality. Scality was founded in 2009 by a team of entrepreneurs and technologists. The idea wasn’t storage, per se. When the Scality team talked to the initial base of potential customers, the customers wanted a system that could “route” data to and from individual users in the most scalable, efficient way possible. And so began a non-traditional approach to building a storage system that no one had imagined before. No one thought an object store could have enough performance for all the files and attachments of millions of users. No one thought a system could remain up and running through software upgrades, hardware failures, capacity expansions, and even multiple hardware generations coexisting. And no one believed you could do all this and scale to petabytes of content and billions of objects in pure software.
