
SDN Journal: Article

SolidFire Delivers True Storage Agility to the Next Generation Data Center

Element OS Version 6 Provides the Most Complete Range of Enterprise Class Features in Any All-Flash Array

SolidFire has introduced Version 6 of its Element OS, named Carbon, bringing a new set of enterprise-class features to its all-flash arrays. Building on its success as the benchmark storage architecture for large-scale cloud service providers, SolidFire is rolling out features that pave the way for enterprises striving to deliver a more agile, automated, and scalable storage infrastructure. The new functionality will be generally available in Q2 2014.

The increasing pressures on Enterprise IT
The era of cloud computing has dramatically raised expectations for both the speed and the cost at which Enterprise IT services are delivered and consumed. This radical shift in expectations has become a driving force behind the transformation of enterprise data centers worldwide.

"Storage is at the core of the next generation data center," commented Dave Wright, SolidFire Founder and CEO. "Neither traditional disk systems nor today's basic all-flash arrays are supporting this transformation in resource allocation and management. Our customers expect great performance from us, but they also expect us to support their broader business objectives to deliver internal storage services that are more agile, scalable, automated, and predictable than ever before."

Bringing storage agility to the Next Generation Data Center
SolidFire recently previewed the features of Element OS 6, and introduced key customers Internap, SunGard, and ServInt, before an audience of more than 35 industry analysts and influencers from around the world at the company's first Analyst Day in Boulder, Colorado.

"SolidFire attacks what to me is the most glaring missing element in tomorrow's enterprise data center -- Quality of Service," commented Steve Duplessie, founder and senior analyst of the Enterprise Strategy Group. "As more and more applications are delivered from shared storage infrastructure, performance predictability and scale have become paramount. That's been the problem with traditional storage architectures in the modern era of infrastructure virtualization."

With this release, SolidFire is introducing a combination of unique features that smooth the enterprise transition to Next Generation Data Center technologies. These new features include:

Introduction of Fibre Channel Connectivity: Adding to its 10Gb iSCSI connectivity, SolidFire introduces 16Gb active/active Fibre Channel (FC) connectivity across its full line of all-flash arrays -- SF3010, SF6010, and SF9010. This added functionality enables enterprise customers to easily transition current FC workloads and take advantage of SolidFire's guaranteed storage performance, system automation, and scale-out architecture.

Real-Time Replication: SolidFire's Real-Time Replication technology enables the quick and cost-effective creation of additional remote copies of data. Native to the SolidFire design, this functionality delivers essential disaster recovery capabilities to CSP and enterprise customers without the need for third-party hardware or software. The SolidFire replication model is extremely flexible: each cluster can be paired with up to four other clusters and replicate data in either direction, allowing for easy failover and failback.
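
The pairing rules described above -- at most four peer clusters, with replication allowed in either direction -- can be sketched as a small model. This is an illustration of the constraints as stated in the release, not SolidFire's actual API; the names `Cluster`, `pair`, and `MAX_PEERS` are assumptions for the sake of the example.

```python
# Hypothetical model of the cluster-pairing rules described in the release:
# each cluster may be paired with up to four others, and a pairing is
# symmetric, so data can replicate in either direction between peers.
MAX_PEERS = 4  # limit stated in the announcement


class Cluster:
    def __init__(self, name):
        self.name = name
        self.peers = set()


def pair(a, b):
    """Pair two clusters for bidirectional replication."""
    if b in a.peers:
        return  # already paired; nothing to do
    if len(a.peers) >= MAX_PEERS or len(b.peers) >= MAX_PEERS:
        raise ValueError("a cluster can be paired with at most four others")
    a.peers.add(b)
    b.peers.add(a)  # symmetric pairing enables easy failover and failback
```

Because pairing is symmetric, failing over from a primary site and later failing back requires no reconfiguration of the relationship itself.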

Mixed-Node Cluster Support: SolidFire storage systems now support mixing storage nodes of different capacities, performance levels, and protocols within a single cluster. Within every SolidFire storage system, capacity and performance are managed as two global, separate resource pools. When new storage nodes are added to a cluster, additional capacity and performance are made immediately available to both existing applications and new workloads. Additionally, Mixed-Node Cluster Support allows enterprise customers to continually leverage the economics of the most current flash technology on the market while providing long-term investment protection.
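
The "two global resource pools" idea above can be sketched in a few lines. This is a minimal illustration, not SolidFire's implementation; the node figures below are invented for the example, not actual SF-series specifications.

```python
# Minimal sketch of a mixed-node cluster exposing capacity and performance
# as two separate, global pools: adding any node, of any generation, grows
# both pools immediately for all workloads.
from dataclasses import dataclass


@dataclass(frozen=True)
class Node:
    model: str          # e.g. "SF3010" (model names from the article)
    capacity_tb: float  # illustrative figure, not a real spec
    iops: int           # illustrative figure, not a real spec


class ClusterPools:
    def __init__(self):
        self.nodes = []

    def add_node(self, node):
        # New capacity and performance become available cluster-wide at once.
        self.nodes.append(node)

    @property
    def capacity_tb(self):
        return sum(n.capacity_tb for n in self.nodes)

    @property
    def iops(self):
        return sum(n.iops for n in self.nodes)
```

Because the pools are global, decommissioning an older node simply shrinks both totals; no per-node rebalancing is visible to applications in this model.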

"With mixed node support, SolidFire has eradicated the concept of 'generational' or 'forklift' upgrades common with traditional disk and other all-flash storage systems," said Matt Loschert, CTO of managed hosting provider ServInt. "As we scale our storage infrastructure we simply add the most current SolidFire platform without downtime or impact to our hosted customers -- resources are instantly available. Decommissioning systems is as simple as adding them. We can take them offline without compromising availability or any of the Quality of Service (QoS) settings that we have established with our customers."

Integrated Backup & Restore: This unique SolidFire functionality provides native snapshot-based backup and restore compatible with any object store or device that has an S3- or Swift-compatible API. This first-of-its-kind functionality eliminates the cost and complexity of third-party backup and recovery products, while dramatically accelerating backup performance. CSP and enterprise customers can now effortlessly scale backups for thousands of hosts and applications.
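
At its core, snapshot-based backup to an object store means mapping each volume snapshot onto object keys. The sketch below illustrates that idea only: the key layout and function names are assumptions, and an in-memory dict stands in for a real S3- or Swift-compatible store.

```python
# Illustrative sketch of snapshot-based backup to an object store.
# The dict `store` stands in for any S3/Swift-compatible backend, and the
# "backups/vol-<id>/snap-<id>" key scheme is hypothetical.
import hashlib


def snapshot_key(volume_id, snapshot_id):
    # e.g. "backups/vol-42/snap-7" -- assumed naming convention
    return f"backups/vol-{volume_id}/snap-{snapshot_id}"


def backup_snapshot(store, volume_id, snapshot_id, data):
    """Write snapshot bytes to the store; return the key and a checksum."""
    key = snapshot_key(volume_id, snapshot_id)
    store[key] = data
    return key, hashlib.sha256(data).hexdigest()


def restore_snapshot(store, volume_id, snapshot_id):
    """Read snapshot bytes back from the store."""
    return store[snapshot_key(volume_id, snapshot_id)]
```

Since the store only needs to accept keyed PUT/GET operations, the same flow works against any backend that speaks an S3- or Swift-compatible API.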

For more information on SolidFire's all-flash storage systems, including this new release, please see http://www.solidfire.com or schedule a live demo today.

By Liz McMillan
