
Consolidating Big Data

How to make your data center more cost-effective while improving performance

Cloud computing has opened the doors to a vast array of online services. With the emergence of new cloud technologies, both public and private companies are seeing gains in performance, elasticity and convenience. Maintaining a competitive advantage, however, has become increasingly difficult, and service providers are taking a closer look at their data storage infrastructure for ways to improve performance and cut costs.

If the status quo remains, maintaining low-cost cloud services will become increasingly difficult. Service providers will incur higher costs, while consumers become burdened with storage capacity restrictions. Such obstacles are influencing service providers to find new ways to scale cost-effectively and increase performance in the data center.

Cost-Benefit Analysis
In response to growing online account activity, service providers are consolidating their data centers into centralized environments. By doing so, they are able to cut costs while increasing efficiency, making data accessible from any location. Centralizing equipment enables providers to deliver better Internet connectivity, performance and reliability.

However, these benefits come with disadvantages. For instance, scalability becomes more expensive and difficult to achieve. Improving efficiency within a centralized data center requires purchasing additional high-performance, specialized equipment, which drives up costs and energy consumption, both of which are hard to control at scale. In an economy where cost-cutting is a necessity for large and small enterprises alike, these added expenses are unacceptable.

Characteristics of the Cloud
Solving performance problems, such as data bottlenecks, is a growing concern for cloud providers, who must oversee significantly more users, and accompanying performance demands, than enterprises do. Although the average user of an enterprise system requires elevated performance, these systems generally manage fewer users, who can access their data directly over the network. Moreover, enterprise system users are accessing, saving and sending comparatively small files that require less storage capacity and performance.

Outside the internal enterprise network, however, it's a different story. Cloud systems are accessed simultaneously by a multitude of users across the Internet, which itself becomes a performance bottleneck. The average cloud user also stores larger files than the average enterprise user, placing greater strain on data center resources. The cloud provider's storage system not only has to scale to each user, but must also sustain performance across all users.

Best Practices
Growing storage demands carry profound business implications: service providers need to scale quickly to meet booming demand for data storage. The following best practices can help optimize data center ROI in a period of significant IT cutbacks:

  • Opt for commodity components when possible: Low-energy hardware makes good business sense. Commodity hardware is not only cost-effective, but also energy-efficient, which significantly reduces both setup and operating costs in one move.
  • Seek out a distributed storage system: Distributed storage presents the best way to build at scale even though the data center trend has been moving toward centralization. Increased performance at the software level counterbalances the performance advantage of a centralized data storage approach.
  • Avoid bottlenecks: A single point of entry can easily lead to a performance bottleneck. Adding caches to relieve the bottleneck, as most data center infrastructures currently do, quickly adds cost and complexity to a system. A horizontally scalable system that distributes data among all nodes, by contrast, avoids the single point of entry and delivers a high level of redundancy (see the sketch after this list).
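
To make the last two points concrete, below is a minimal sketch in Python of one common way a distributed storage system can spread data evenly across commodity nodes: consistent hashing with replication. The article does not describe any vendor's actual implementation; the node names, replica count and hashing scheme here are illustrative assumptions only.

    # Illustrative sketch only: consistent hashing with replication,
    # one common technique for distributing data among all nodes.
    import hashlib
    from bisect import bisect_right

    class HashRing:
        def __init__(self, nodes, vnodes=100, replicas=2):
            self.replicas = replicas  # copies kept for redundancy
            # Hash each physical node many times ("virtual nodes")
            # so data spreads evenly around the ring.
            self.ring = sorted(
                (self._hash(f"{node}:{i}"), node)
                for node in nodes
                for i in range(vnodes)
            )

        @staticmethod
        def _hash(key):
            return int(hashlib.md5(key.encode()).hexdigest(), 16)

        def nodes_for(self, key):
            # Walk clockwise from the key's position until enough
            # distinct nodes have been collected.
            idx = bisect_right(self.ring, (self._hash(key), ""))
            chosen = []
            while len(chosen) < self.replicas:
                _, node = self.ring[idx % len(self.ring)]
                if node not in chosen:
                    chosen.append(node)
                idx += 1
            return chosen

    # Hypothetical four-node cluster of commodity machines.
    ring = HashRing(["node-a", "node-b", "node-c", "node-d"])
    print(ring.nodes_for("user42/backup/photos.zip"))
    # e.g. ['node-b', 'node-d'] -- two replicas, no single entry point

Because every node owns a share of the key space, reads and writes are spread across the whole cluster rather than funneled through a single entry point, and adding a node reshuffles only a small fraction of the keys.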

Moving Forward
Currently, Big Data storage consists mainly of high-performance, vertically scaled storage systems. Because these infrastructures are costly and can only scale to a single petabyte, they are not a sustainable solution. Moving to a horizontally scaled storage model that distributes data evenly across energy-efficient hardware can reduce costs and increase performance in the cloud. With these insights, cloud service providers can take the necessary steps to improve the efficiency, scalability and performance of their data centers.

More Stories By Stefan Bernbo

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, he has designed and built numerous enterprise-scale data storage solutions intended to store huge data sets cost-effectively. From 2004 to 2010, Stefan worked in this field at Storegate, the wide-reaching Internet-based storage solution for the consumer and business markets, which carried the highest possible availability and scalability requirements. Before that, Stefan worked on system and software architecture on several projects with Swedish giant Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.


