By Gil Haberman
March 3, 2014 07:45 AM EST
CIOs cited reducing enterprise costs; improving IT applications and infrastructure; improving efficiency; and improving business processes among the top 10 business priorities, according to a Gartner Executive Program Survey conducted last year.
To address these business priorities, many CIOs are rethinking how they architect their IT infrastructure to better deliver applications and services to end users. A 2013 study by IDG Research Services surveyed 333 IT directors and higher titles at enterprise companies with more than 500 employees and found that 77% of IT leaders globally believe data center transformation will play a highly important role in achieving their organizations' business goals.
Instead of allowing remote and branch offices to maintain the hardware and data, many enterprise CIOs today are centralizing storage and backups to their core data centers. This makes sense for the sake of simplicity and ensures IT managers adopt a 'hands-on' approach to device management, which mitigates the security risk and complexity of having multiple backup systems and dispersed data sets. In addition, organizations with offices in sometimes unstable or physically challenging locations may not want any locally stored data in those offices, due to security concerns and potential data risks.
The reality of recovering from a disaster, whether natural or human-induced, can be daunting in a typical branch office setup. Data protection solutions there are most often tape-based; recovery can take days and leaves an organization exposed to delay and data loss. A typical branch disaster recovery requires not only physical hardware replacement, but a rebuild and patching of the operating system, reinstallation of applications, virus scanning, and full data recovery before returning to service. Organizations that rely on weekly full and daily incremental backups of branch data face the additional challenge of restoring from multiple tapes, along with the loss of any data created between the last captured backup and the time of the outage.
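To make the multi-tape restore problem concrete, here is a minimal sketch of how a weekly-full plus daily-incremental scheme determines which media a restore needs. The `restore_chain` helper and the sample backup records are illustrative assumptions, not the behavior of any particular backup product:

```python
from datetime import date

def restore_chain(backups, target):
    """Return the backups needed to restore to `target`: the most recent
    full backup on or before the target date, plus every incremental
    taken after that full and up through the target, in date order."""
    fulls = [b for b in backups if b["type"] == "full" and b["date"] <= target]
    if not fulls:
        raise ValueError("no full backup available on or before target date")
    last_full = max(fulls, key=lambda b: b["date"])
    incrementals = sorted(
        (b for b in backups
         if b["type"] == "incremental"
         and last_full["date"] < b["date"] <= target),
        key=lambda b: b["date"],
    )
    return [last_full] + incrementals

# Weekly full on Sunday, daily incrementals afterwards.
backups = [
    {"date": date(2014, 2, 23), "type": "full"},
    {"date": date(2014, 2, 24), "type": "incremental"},
    {"date": date(2014, 2, 25), "type": "incremental"},
    {"date": date(2014, 2, 26), "type": "incremental"},
]

# Restoring to Wednesday needs the Sunday full plus three incremental
# tapes; anything written after Wednesday's backup ran is lost.
chain = restore_chain(backups, date(2014, 2, 26))
```

Every extra tape in the chain is another chance for media failure or operator error, which is why consolidating this process into the data center is attractive.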
Centralizing storage and backups to core data centers also minimizes travel expenses, since dedicated data experts are rarely on-site at the edge to provide support and management. According to IDG Research, 37% of organizations utilize non-IT staff to manage backups at remote branch offices, and this number grows to 67% when the branch office is located outside the United States. Centralizing these services to core data centers puts the data back in the hands of IT experts, rather than non-IT staff filling a role they are not equipped to manage.
A significant driver for many organizations interested in storage consolidation, however, is cost. Companies want to maximize their investments in storage area networks (SANs) and realize the benefits of centralized storage, chief among them lower IT costs from eliminating the need to purchase and maintain local storage and server hardware. One emerging concept that bridges this gap is storage delivery technology.
CIOs are realizing that core business priorities can be addressed, but they need to start by addressing core IT processes and transforming the data center.
As part of this movement, IT managers can expand storage capacity and look to extend the benefits of a consolidated approach to larger branch offices and data-intensive applications that previously were difficult or impossible to consolidate because of local performance requirements.
A Changing IT Landscape
Consolidation has the potential to empower businesses of all sizes to remove servers and data from branch offices and centralize them in the secure data center without sacrificing user experience. This new architectural approach makes it possible to centralize backup operations and remove data from high-risk locations, while increasing agility and lowering the costs of managing remote office IT. High-risk locations include sites more susceptible to natural disasters, hacking, political turmoil, or even outright theft, as well as remote areas that are difficult to access. To succeed in this dynamic environment, IT leaders need agility, security, and control, while business users demand performance.
Identifying the optimal deployment location for IT assets such as servers and supporting storage systems is one of the more challenging aspects of the IT decision process today. When the edge of the enterprise and the core at the data center are linked together in an integrated solution, IT organizations can centralize control, security, and protection of distributed server and storage assets. This approach ensures timely access to (or recovery of) data and applications relied on by users across the extended organization while maximizing IT agility. Organizations can quickly adjust to changing conditions with the right information, delivered at the right time and in the right place to serve customers and partners better while keeping employees happy and productive.
Protection of Data
In line with the growing volumes of data, enterprises should be able to decommission branch backup and recovery systems, shifting data protection operations to the secure data center. There, they can apply their well-honed data center backup and recovery systems, procedures, and skilled personnel to protect branch data.
Snapshots, read-only copies of a data set at a point in time, are an integral part of ensuring IT operations run smoothly. In today's IT environment, administrators must be able to quickly set and assign hourly, daily, or weekly storage snapshot policies to ensure application-consistent data protection in conjunction with supported data center storage arrays.
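The hourly/daily/weekly policy idea can be sketched as a small scheduling and retention check. The `POLICIES` table, `snapshot_due`, and `prune` names below are hypothetical, assumed for illustration; a real storage array would expose its own policy API:

```python
from datetime import datetime, timedelta

# Hypothetical policy table: how often to snapshot, and how many to keep.
POLICIES = {
    "hourly": {"interval": timedelta(hours=1), "retain": 24},
    "daily":  {"interval": timedelta(days=1),  "retain": 7},
    "weekly": {"interval": timedelta(weeks=1), "retain": 4},
}

def snapshot_due(policy_name, last_snapshot, now):
    """True if enough time has elapsed under the named policy."""
    return now - last_snapshot >= POLICIES[policy_name]["interval"]

def prune(snapshot_times, policy_name):
    """Keep only the newest `retain` snapshots for the policy."""
    retain = POLICIES[policy_name]["retain"]
    return sorted(snapshot_times)[-retain:]

# More than an hour since the last hourly snapshot, so one is due.
due = snapshot_due("hourly",
                   datetime(2014, 3, 3, 6, 0),
                   datetime(2014, 3, 3, 7, 45))
```

Pairing an interval with a retention count per tier is what lets administrators trade recovery-point granularity against storage consumed by old snapshots.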
Once storage snapshots are created in the data center, many organizations not only leverage the disk snapshots for fast recovery, but also replicate them to a secondary data center or send a copy to cloud storage to ensure data is located offsite.
Organizations' information bases and requirements are constantly changing in response to shifting customer demands and business requirements. Consolidating systems into centralized data centers must make processes faster and less operationally intensive, both now and in the future. A well-thought-out IT architecture makes it easier to scale, simplifying the rollout of additional services to new user bases and locations. Finally, a solid IT architecture makes it easier for an enterprise to react quickly when disaster strikes or as the needs of the organization change.
- "Gartner Executive Program Survey," Gartner, 2013
- "Riverbed Data Center Transformation Survey," IDG Research, 2013