
ARM Server to Transform #BigData to #IoT | @CloudExpo #DigitalTransformation

New Microserver computing platform offers compelling benefits for the right applications

A completely new computing platform is on the horizon. Some call it the Microserver, others the ARM Server, and still others the ARM-based Server. No matter what you call them, Microservers will have a huge impact on the data center and on server computing in general.

What Is a Microserver...and What Isn't
Although few people are familiar with Microservers today, their impact will be felt very soon. This is a new category of computing platform that is available today and is predicted to have triple-digit growth rates for some years to come - growing to over 20% of the server market by 2016 according to Oppenheimer ("Cloudy With A Chance of ARM" Oppenheimer Equity Research Industry Report).

According to Chris Piedmonte, CEO of Suvola Corporation - a software and services company focused on creating preconfigured, scalable Microserver appliances for deploying large-scale enterprise applications - "the Microserver market is poised to grow by leaps and bounds - because companies can leverage this kind of technology to deploy systems that offer 400% better cost-performance at half the total cost of ownership. These organizations will also benefit from the superior reliability, reduced space and power requirements, and lower cost of entry provided by Microserver platforms."

This technology may be poised to grow, but today Microservers aren't mainstream at all - they hold well under 1% of the server market. Few people know about them, and there is a fair amount of confusion in the marketplace. There isn't even agreement on what to call them: different people use different names - Microserver, ARM Server, ARM-based Server and who knows what else.

To further confuse the issue, there are a number of products in the market called "Microservers" that aren't Microservers at all - for example the HP ProLiant MicroServer or the HP Moonshot chassis. These products are smaller and use less power than traditional servers, but they are just a slightly different flavor of the standard Intel/AMD servers we are all familiar with. Useful, but not at all revolutionary - and with a name that causes unfortunate confusion in the marketplace.

Specifically, a Microserver is a server based on "system-on-a-chip" (SoC) technology - where the CPU, memory controllers, system I/O and so on are all on a single chip, not multiple components spread across a system board (or even multiple boards).

What Makes ARM Servers Revolutionary?
ARM Servers are an entirely new generation of server computing - and they will make serious inroads into the enterprise in the next few years. A serious innovation - revolutionary, not evolutionary.

These new ARM Server computing platforms are an entire system - multiple CPU cores, memory controllers, input/output controllers for SATA, USB, PCIe and others, high-speed network interconnect switches, etc. - all on a SINGLE chip measuring only one square inch. This is hyperscale integration technology at work.

To help put this into context, you can fit 72 quad-core ARM Servers into the space used by a single traditional server board.

Today's traditional server racks are typically packed with boards based on Intel Xeon or AMD Opteron chips and are made up of a myriad of discrete components. They're expensive, powerful, power-hungry, take up a considerable amount of space, and can quickly heat a room to the point where you might think you're in a sauna.

In contrast, ARM Servers with their SoC design are small, very energy efficient, reliable and scalable - and incredibly well suited for a wide variety of mainstream computing tasks that deal with large numbers of users, data and applications (Web services, data crunching, media streaming, etc.). The SoC approach of putting an entire system on a chip results in a computer that can operate on as little as 1.5 watts of power.

Add in memory and a solid-state "disk drive" and you could have an entire server that runs on under 10 watts of power. For example, Calxeda's ECX-1000 quad-core ARM Server node with built-in Ethernet and SATA controllers, and 4GB of memory uses 5 watts at full power. In comparison, my iPhone charger is 7 watts and the power supply for the PC on my desk is 650 watts (perhaps that explains the $428 electric bill I got last month).
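
To put some quick arithmetic behind those numbers, here's a back-of-envelope sketch (Python, purely for illustration) using only the wattage figures quoted above - keeping in mind that the 650 watts is a power-supply rating, not measured draw:

    # Rough power comparison using the figures quoted in the text.
    arm_node_watts = 5        # Calxeda ECX-1000 quad-core node, 4GB RAM, at full power
    iphone_charger_watts = 7  # for scale
    desktop_psu_watts = 650   # rated output of a typical desktop power supply

    print(f"One quad-core ARM Server node: {arm_node_watts} W")
    print(f"iPhone charger: {iphone_charger_watts} W")
    print(f"ARM nodes that fit within the desktop PSU rating: "
          f"{desktop_psu_watts // arm_node_watts}")

By that crude measure, the rated output of one desktop power supply could feed on the order of 130 of these nodes.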

Realistically, these ARM Servers use about 1/10th the power, and occupy considerably less than 1/10th the space, of traditional rack-mounted servers (for systems of equivalent computing power). And they come at an acquisition price of about half of what a traditional system costs.

And they are designed to scale. The Calxeda ECX-1000 ARM Servers are packaged into "Energy Cards", each composed of four quad-core chips and 16 SATA ports, with an embedded 80 gigabit per second interconnect switch. That fabric lets you connect potentially thousands of nodes without all the cabling inherent in traditional rack-mounted systems (a large Intel-based system can have upwards of 2,000 cables). It also provides extreme performance: node-to-node communication occurs on the order of 200 nanoseconds.

You can have four complete ARM Servers on a board that is only ten inches long and uses only about 20 watts of power at full speed - that's revolutionary.

How Do ARM Servers Translate into Business Benefits?
When you account for reduced computing center operations costs, lower acquisition costs, increased reliability due to simpler construction / fewer parts, and less administrative cost as a result of fewer cables and components, we're talking about systems that could easily cost 70% less to own and operate.
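
To see how you get to a number like 70%, here's an illustrative sketch that combines the two ratios cited in this article (roughly half the acquisition price, roughly one-tenth the power) with an assumed cost split and an assumed administrative saving - the weightings are hypothetical placeholders, not measured data:

    # Illustrative total-cost comparison; the cost split below is an ASSUMPTION.
    traditional = {"acquisition": 100, "power_cooling": 60, "admin": 40}

    arm_server = {
        "acquisition": traditional["acquisition"] * 0.5,      # ~half the purchase price (per this article)
        "power_cooling": traditional["power_cooling"] * 0.1,  # ~1/10th the power (per this article)
        "admin": traditional["admin"] * 0.5,                  # ASSUMPTION: fewer cables and parts to manage
    }

    t_total = sum(traditional.values())
    a_total = sum(arm_server.values())
    print(f"Savings: {100 * (1 - a_total / t_total):.0f}%")   # ~62% with these placeholder weightings

With those placeholder weightings the savings come out around 62%; weight power, cooling and administration more heavily and the figure approaches the 70% mentioned above.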

If you toss in the cost to actually BUILD the computing center and not just "operate it", then the cost advantage is even larger. That's compelling - especially to larger companies that spend millions of dollars a year building and operating computing centers. Facebook, for example, has been spending about half a billion (yes, with a "b") dollars a year lately building and equipping their computing centers. Mobile devices are driving massive spending in this area - and in many cases, these are applications which are ideal for ARM Server architectures.

Why Don't I See More ARM Servers?
So - if all this is true, why do Microservers have such a negligible share of the server market?

My enthusiasm for ARM Servers lies in their potential. This is still an early-stage technology, and Microserver hardware has really only been available since the second half of 2012. I doubt any companies are going to trade in all their traditional rack servers for Microservers this month. The ecosystem for ARM Servers isn't fully developed yet. And ARM Servers aren't the answer to every computing problem - the hardware has some limitations (it's 32-bit, at least for now), and it's a platform better suited to some classes of computing than others. Oh, and although it runs various flavors of Linux, it doesn't run Windows - whether that is a disadvantage depends on your individual perspective.
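
To make the 32-bit point concrete: a 32-bit address space tops out at 2^32 bytes, so a single process is limited to 4GB of addressable memory - a real ceiling for large in-memory workloads. A trivial sanity check (again Python, for illustration):

    import platform

    # A 32-bit address space can address at most 2**32 bytes per process.
    print(f"32-bit addressable limit: {2 ** 32 / 2 ** 30:.0f} GiB")   # -> 4 GiB

    # On a 32-bit ARM Linux system this typically reports something like 'armv7l'.
    print(platform.machine())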

Microservers in Your Future?
Irrespective of these temporary shortcomings, make no mistake - this is a revolutionary shift in the way that server systems will be (and should be) designed. Although you personally may never own one of these systems, within the next couple of years you will make use of ARM Servers all the time - they have the potential to shrink the cost of Cloud Computing, "Big Data", media streaming and all kinds of Web computing services to a fraction of what it is today.

Keep your eye on this little technology - it's going to be big.


Note: The author of this article works for Dell. The opinions stated are his own personal opinions vs. those of his employer.

