|By Hollis Tibbetts||
|November 4, 2016 09:00 AM EDT||
A completely new computing platform is on the horizon. Some call these machines Microservers, others ARM Servers, and still others ARM-based Servers. Whatever you call them, Microservers will have a huge impact on the data center and on server computing in general.
What Is a Microserver...and What Isn't
Although few people are familiar with Microservers today, their impact will be felt very soon. This is a new category of computing platform that is available today and is predicted to have triple-digit growth rates for some years to come - growing to over 20% of the server market by 2016 according to Oppenheimer ("Cloudy With A Chance of ARM" Oppenheimer Equity Research Industry Report).
According to Chris Piedmonte, CEO of Suvola Corporation - a software and services company focused on creating preconfigured and scalable Microserver appliances for deploying large-scale enterprise applications, "the Microserver market is poised to grow by leaps and bounds - because companies can leverage this kind of technology to deploy systems that offer 400% better cost-performance at half the total cost of ownership. These organizations will also benefit from the superior reliability, reduced space and power requirements, and lower cost of entry provided by Microserver platforms".
This technology might be poised to grow, but today these Microservers aren't mainstream at all - they hold well under 1% of the server market. Few people know about them, and there is a fair amount of confusion in the marketplace. There isn't even agreement on what to call them: different people use different names - Microserver, ARM Server, ARM-based Server and who knows what else.
To further confuse the issue, there are a number of products out there in the market that are called "Microservers" but aren't Microservers at all - for example, the HP ProLiant MicroServer or the HP Moonshot chassis. These products are smaller and use less power than traditional servers, but they are just a slightly different flavor of the standard Intel/AMD server we are all familiar with. Useful, but not at all revolutionary - and with a name that causes unfortunate confusion in the marketplace.
Specifically, a Microserver is a server based on "system-on-a-chip" (SoC) technology - where the CPU, memory controllers and system I/O are all on a single chip, not spread across multiple components on a system board (or even multiple boards).
What Makes ARM Servers Revolutionary?
ARM Servers are an entirely new generation of server computing - and they will make serious inroads into the enterprise in the next few years. A serious innovation - revolutionary, not evolutionary.
These new ARM Server computing platforms are an entire system - multiple CPU cores, memory controllers, input/output controllers for SATA, USB, PCIe and others, high-speed network interconnect switches, etc. - all on a SINGLE chip measuring only one square inch. This is hyperscale integration technology at work.
To help put this into context, you can fit 72 quad-core ARM Servers into the space used by a single traditional server board.
Today's traditional server racks are typically packed with boards based on Intel XEON or AMD Opteron chips and are made up of a myriad of discrete components. They're expensive, powerful, power-hungry, use up a considerable amount of space, and can quickly heat up a room to the point where you might think you're in a sauna.
In contrast, ARM Servers with their SoC design are small, very energy efficient, reliable, scalable - and incredibly well-suited for a wide variety of mainstream computing tasks dealing with large numbers of users, data and applications (like Web services, data crunching, media streaming, etc.). The SoC approach of putting an entire system on a chip results in a computer that can operate on as little as 1.5 watts of power.
Add in memory and a solid-state "disk drive" and you could have an entire server that runs on under 10 watts of power. For example, Calxeda's ECX-1000 quad-core ARM Server node with built-in Ethernet and SATA controllers, and 4GB of memory uses 5 watts at full power. In comparison, my iPhone charger is 7 watts and the power supply for the PC on my desk is 650 watts (perhaps that explains the $428 electric bill I got last month).
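As a back-of-the-envelope check on those wattage figures, the electricity cost difference is easy to compute. The numbers below (5 W for an ARM Server node, 650 W for a desktop PC power supply at full rating, $0.12/kWh) are illustrative assumptions taken from the comparison above, not measured benchmarks:

```python
# Back-of-the-envelope annual power-cost comparison.
# All inputs are illustrative assumptions, not measurements.

HOURS_PER_YEAR = 24 * 365   # 8,760 hours of continuous operation
RATE_PER_KWH = 0.12         # assumed electricity price in $/kWh

def annual_energy_cost(watts, rate=RATE_PER_KWH):
    """Annual electricity cost for a device drawing `watts` continuously."""
    kwh = watts / 1000 * HOURS_PER_YEAR
    return kwh * rate

arm_node = annual_energy_cost(5)    # one quad-core ARM Server node
big_psu = annual_energy_cost(650)   # a PC running at full PSU rating

print(f"ARM node: ${arm_node:.2f}/year")
print(f"650 W PC: ${big_psu:.2f}/year")
print(f"Ratio:    {big_psu / arm_node:.0f}x")
```

Even granting that no machine runs at its full power-supply rating around the clock, the two-orders-of-magnitude gap in draw is what makes the density story below possible.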
Realistically, these ARM Servers use about 1/10th the power, and occupy considerably less than 1/10th the space of traditional rack-mounted servers (for systems of equivalent computing power). And at an acquisition price of about half of what a traditional system costs.
And they are designed to scale - the Calxeda ECX-1000 ARM Servers are packaged into "Energy Cards" composed of four quad-core chips and 16 SATA ports. Each card embeds an 80 gigabit per second interconnect switch, which lets you easily connect potentially thousands of nodes without all the cabling inherent in traditional rack-mounted systems (a large Intel-based system could have upwards of 2,000 cables). This also provides extreme performance - node-to-node communication occurs on the order of 200 nanoseconds.
You can have four complete ARM Servers on a board that is only ten inches long and uses only about 20 watts of power at full speed - that's revolutionary.
How Do ARM Servers Translate into Business Benefits?
When you account for reduced computing center operations costs, lower acquisition costs, increased reliability due to simpler construction / fewer parts, and less administrative cost as a result of fewer cables and components, we're talking about systems that could easily cost 70% less to own and operate.
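That 70% figure can be sanity-checked with a toy cost model. The baseline cost breakdown and the reduction factors below are illustrative assumptions loosely based on the figures in this article (about half the acquisition price, roughly 1/10th the power, fewer components to administer) - they are not from any vendor's TCO study:

```python
# Toy total-cost-of-ownership model. The cost split and reduction
# factors are illustrative assumptions, not published TCO data.

baseline = {                 # share of traditional-server TCO (sums to 1.0)
    "acquisition": 0.40,
    "power_cooling": 0.35,
    "administration": 0.25,
}

reduction = {                # fraction of each cost an ARM Server retains
    "acquisition": 0.50,     # about half the purchase price
    "power_cooling": 0.10,   # about 1/10th the power draw
    "administration": 0.60,  # fewer cables and components to manage
}

arm_tco = sum(baseline[k] * reduction[k] for k in baseline)
savings = 1 - arm_tco
print(f"ARM TCO: {arm_tco:.0%} of traditional")
print(f"Savings: {savings:.0%}")
```

With these assumed weights the model lands at roughly 60% savings - the same ballpark as the 70% claim, and sensitive mostly to how large a share power and cooling represent in a given data center.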
If you toss in the cost to actually BUILD the computing center and not just "operate it", then the cost advantage is even larger. That's compelling - especially to larger companies that spend millions of dollars a year building and operating computing centers. Facebook, for example, has been spending about half a billion (yes, with a "b") dollars a year lately building and equipping their computing centers. Mobile devices are driving massive spending in this area - and in many cases, these are applications which are ideal for ARM Server architectures.
Why Don't I See More ARM Servers?
So - if all this is true, why do Microservers have such a negligible market share of the Server market?
My enthusiasm for ARM Servers is in their potential. This is still an early-stage technology, and Microserver hardware has really only been available since the second half of 2012. I doubt any companies are going to trade in all their traditional rack servers for Microservers this month. The ecosystem for ARM Servers isn't fully developed yet. And ARM Servers aren't the answer to every computing problem - the hardware has some limitations (it's 32-bit, at least for now), and it's a platform better suited for some classes of computing than others. Oh, and although it runs various flavors of Linux, it doesn't run Windows - whether that is a disadvantage depends on your individual perspective.
Microservers in Your Future?
Irrespective of these temporary shortcomings, make no mistake - this is a revolutionary shift in the way that server systems will be (and should be) designed. Although you personally may never own one of these systems, within the next couple of years, you will make use of ARM Servers all the time - as they have the potential to shrink the cost of Cloud Computing, "Big Data", media streaming and any kind of Web computing services to a fraction of the cost of what they are today.
Keep your eye on this little technology - it's going to be big.
Note: The author of this article works for Dell. The opinions stated are his own personal opinions vs. those of his employer.