Gigaom Research: Benefits and Challenges of Software Defined Power

Power Assure Software Defined Power Solution Moves Substantial Live Loads Across Multiple Data Centers With No Impact on Service Levels

SANTA CLARA, CA -- (Marketwired) -- 05/06/14 -- Power Assure®, Inc., the leading Software Defined Power solutions provider for data centers, today announced that Gigaom Research has issued an underwritten research report titled "Benefits and Challenges of Software Defined Power," authored by analyst David S. Linthicum. The report identifies Software Defined Power (SDP) as an evolving technology that will become commonplace within a few years and notes that, because of its ROI and reliability benefits, SDP can pay for itself in a short amount of time.

"As this technology emerges, chances are high that it will find its way into most data centers, becoming table stakes for running an effective and efficient data center operation. Consider the value of avoiding service outages as well as the ability to move to a lower cost of operations," said Linthicum in his report, adding "The use of this technology is destined to change the way we consider power management. We can finally couple power systems and the applications they power to better drive efficiency. While this will be an evolving science, even the initial value this technology brings to the industry is substantial."

Software Defined Power is an emerging technology that manages and moves IT loads between data centers, including data centers that are part of a public cloud service, to wherever power reliability, availability, or cost is optimal. It continuously exercises disaster recovery procedures, making them substantially more reliable, while matching online resources to application demand both within and across multiple sites with fail-safe verification.
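
To make the placement decision concrete, here is a minimal illustrative sketch of how such a system might choose a data center based on power cost, availability, and health, falling back to leaving the load in place when no site qualifies. The class, function, and site names are hypothetical and are not drawn from Power Assure's product or the Gigaom report.

    # Minimal sketch of a Software Defined Power style placement decision.
    # All names and inputs here are hypothetical, for illustration only.
    from dataclasses import dataclass
    from typing import List, Optional


    @dataclass
    class DataCenter:
        name: str
        power_available_kw: float   # power headroom reported by the facility
        power_price_per_kwh: float  # current utility price at that site
        healthy: bool               # result of the latest verification check


    def choose_site(sites: List[DataCenter], required_kw: float) -> Optional[DataCenter]:
        """Pick the cheapest healthy site with enough power headroom for the load."""
        candidates = [s for s in sites if s.healthy and s.power_available_kw >= required_kw]
        if not candidates:
            return None  # fail-safe: leave the load where it is
        return min(candidates, key=lambda s: s.power_price_per_kwh)


    if __name__ == "__main__":
        sites = [
            DataCenter("east", power_available_kw=120.0, power_price_per_kwh=0.11, healthy=True),
            DataCenter("west", power_available_kw=300.0, power_price_per_kwh=0.07, healthy=True),
        ]
        target = choose_site(sites, required_kw=80.0)
        print(f"Place load in: {target.name if target else 'no move (fail-safe)'}")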

The report further explores the fundamentals of Software Defined Power and reviews how IT organizations can apply SDP in their data centers to increase the availability of applications and data. It notes that innovative companies are already leveraging Software Defined Power to increase reliability and reduce costs, and proposes that enterprises approach the technology with a measure of planning to maximize their chances of success.

Linthicum also points out that, as part of his research for the paper, Power Assure demonstrated its Software Defined Power technology. As detailed in the report, the demonstration showed the following:

  • A transactional web application was demonstrated operating across four geographically dispersed data centers, each supported by its own power systems.

  • The Power Assure Software Defined Power system responded automatically to changes in application demand, with no impact on users even when the rate of change was pronounced.

  • The operation was fail-safe, with application resources thoroughly tested before any load was placed on them.

  • The Power Assure Software Defined Power system proactively moved substantial live loads between data centers 2,500 miles apart with no impact on connected users or service levels.

  • Loads were relocated dynamically to take advantage of the lowest-cost power at each data center; the load was redirected based on its power requirements and the changing cost of power, all without disrupting application services (see the sketch after this list).

  • This same approach may also be leveraged for dynamic failover scenarios in support of disaster recovery operations.
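
The behavior described in these bullets amounts to a simple control loop: watch power prices, verify the candidate site, and only then move the live load. The following is a hedged, simplified sketch of that flow; the function names, site names, and prices are hypothetical and do not represent Power Assure's implementation or the report's data.

    # Simplified sketch of the demonstrated workflow: re-evaluate placement as
    # power prices change and verify the target before shifting live load.
    # All names and values are hypothetical, for illustration only.
    import time


    def verify_resources(site: str) -> bool:
        """Stand-in health and capacity check run before any load is placed on a site."""
        return True  # a real system would exercise the application stack end to end


    def migrate_load(source: str, target: str) -> None:
        """Stand-in for a live, user-transparent workload move between sites."""
        print(f"Moving live load from {source} to {target}")


    def rebalance(current_site: str, prices: dict) -> str:
        """Move to the cheapest site only if it passes verification first."""
        cheapest = min(prices, key=prices.get)
        if cheapest != current_site and verify_resources(cheapest):
            migrate_load(current_site, cheapest)
            return cheapest
        return current_site  # fail-safe: stay put if verification fails


    if __name__ == "__main__":
        site = "dallas"
        for prices in [{"dallas": 0.09, "oregon": 0.06}, {"dallas": 0.05, "oregon": 0.08}]:
            site = rebalance(site, prices)
            time.sleep(0.1)  # placeholder for the real polling interval
        print(f"Load currently served from: {site}")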

The report concludes by highlighting the increasing importance of managing power based on application workload and the clear advantage of Software Defined Power in its ability to deliver a quick ROI.

Read the full report: Gigaom Research, "Benefits and Challenges of Software Defined Power."

About Power Assure (www.powerassure.com)
Power Assure is the leading developer of software-defined power solutions for large enterprises, government agencies, and managed service providers. Power Assure's solutions provide visibility, intelligence, analytics and automation to help CIOs, IT directors, and facilities managers optimize capacity, service levels, and power consumption within and across data centers. Headquartered in Santa Clara, CA, the company is privately held with funding from ABB Technology Ventures, Dominion Energy Technologies, Draper Fisher Jurvetson, Good Energies, Point Judith Capital, and a grant from the Department of Energy. Power Assure partners include ABB, Cisco, Dell, IBM, In-Q-Tel, PARC, Raritan, UL and VMware.

Power Assure is a registered trademark of Power Assure, Inc. All product names and references remain the trademarks or registered trademarks of their respective owners.


Contact:
Beth Winkowski
Winkowski Public Relations, LLC for Power Assure
Phone: 978-649-7189


