
Load DynamiX Simplifies Storage Performance Workload Modeling to Accelerate Flash and Hybrid Storage System Adoption

Load DynamiX, the leader in storage infrastructure performance validation, today announced its latest storage workload modeling products, which enable enterprise storage architects and engineers to accelerate deployment of solid state technologies, validate performance, and optimize storage budgets.

Key offerings include:

  • An all-in-one two rack unit (2RU) appliance that can emulate tens of thousands of clients across all SAN and NAS environments.
  • The first-ever VDI application workload model that looks at application bottlenecks from the storage perspective.
  • The industry’s first and only application modeling of data compression and inline deduplication, which is key to investment decision-making for flash and hybrid storage systems.

“Our new products represent two key initiatives: lowering the cost and reducing the complexity of deploying storage performance validation solutions, while greatly enhancing the value of storage workload modeling,” stated Philippe Vincent, CEO at Load DynamiX. “These breakthroughs will help storage vendors and architects dramatically improve their performance testing and change validation processes – especially around the adoption of flash and hybrid storage products.”

New Appliances

To cost-effectively stress test enterprise SAN, NAS, and Object storage systems, the new LDX Enterprise Series solution combines the Load DynamiX Enterprise workload modeling application with a load generation appliance containing both 10GbE and FC testing interface ports in a single 2RU enclosure.

The new appliance complements the recently introduced high-density variants of the company’s award-winning FC (16G) and Ethernet (10GE and 1GE) appliances, helping customers stress their largest storage systems, including unified arrays that span SAN and NAS environments.

A new virtual appliance is available as a software-only version of the Load DynamiX load generation appliance. Aimed at technology vendor engineering organizations, it allows development engineers to duplicate proven approaches already used by their QA organizations. Vendors can now test earlier in their development process and accelerate time to market while improving product quality.

Simplified Storage Workload Modeling

Load DynamiX continues to deliver on its strategy to build the industry’s most complete storage workload modeling solution that simulates real-world applications. Enhancements include a simpler user interface, improved analytics and flexible charting capabilities for easier product comparisons. New reporting templates will greatly facilitate typical use cases, such as performance capacity planning and head-to-head product comparisons.

A new VDI application workload model is specifically designed to help users evaluate storage systems for VDI deployments. By creating a large number of VDI Linked Clones and measuring VDI boot storm performance, storage architects can see how well the underlying storage infrastructure will perform under varying user and workload conditions. Armed with this information, storage architects can make intelligent decisions that eliminate both under- and over-provisioning of storage.
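Load DynamiX’s VDI model itself is proprietary, but the core measurement it describes, many linked clones booting at once against shared storage, can be illustrated with a small sketch. Everything below (function names, latency and read counts) is hypothetical: each simulated desktop issues a burst of “reads” charged a fixed storage latency, and the boot storm’s wall-clock time is the figure an architect would compare across arrays and load levels.

```python
import concurrent.futures
import time

def boot_one_desktop(desktop_id, io_latency_s=0.001, boot_reads=50):
    """Crude stand-in for one linked clone's boot I/O: a burst of
    small reads, each charged a fixed per-I/O storage latency."""
    for _ in range(boot_reads):
        time.sleep(io_latency_s)   # placeholder for a real storage read
    return desktop_id

def boot_storm(n_desktops, concurrency):
    """Boot n_desktops with bounded concurrency and report total
    wall-clock time; varying concurrency models different user
    and workload conditions against the same storage latency."""
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as ex:
        list(ex.map(boot_one_desktop, range(n_desktops)))
    return time.perf_counter() - start

print(f"100 desktops, 20 concurrent: {boot_storm(100, 20):.2f}s")
```

Re-running the storm at increasing concurrency shows where boot time stops scaling, which is the point at which the (here, simulated) storage becomes the bottleneck.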

Granular workload models for NFSv3, SMB 2, iSCSI and Fibre Channel have also been added. The new models enable a very high degree of customization and detail around storage access patterns, block/file sizes and distributions, directory structures, and load properties.
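The product’s actual profile format is not public; purely as an illustration of the kind of granularity described (a read/write mix plus a weighted block-size distribution), a workload profile might be sketched in Python as follows, with all class and field names invented for the example:

```python
import random
from dataclasses import dataclass

@dataclass
class WorkloadProfile:
    """Hypothetical granular block-workload profile: a read/write
    mix and a weighted distribution of I/O sizes."""
    read_pct: float     # fraction of operations that are reads
    block_sizes: list   # candidate I/O sizes in bytes
    weights: list       # relative frequency of each size

    def next_op(self, rng):
        """Draw one operation (type, size) from the profile."""
        op = "read" if rng.random() < self.read_pct else "write"
        size = rng.choices(self.block_sizes, weights=self.weights)[0]
        return op, size

# Example: a 70/30 read-heavy profile dominated by 8 KiB I/Os
profile = WorkloadProfile(0.7, [4096, 8192, 65536], [2, 5, 1])
rng = random.Random(42)
ops = [profile.next_op(rng) for _ in range(1000)]
reads = sum(1 for op, _ in ops if op == "read")
print(f"{reads / len(ops):.0%} reads")  # roughly 70%
```

A real model would also cover the access patterns, directory structures, and load properties mentioned above; the point of the sketch is only that a profile is a sampled distribution, not a fixed trace.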

Modeling of Compression & Deduplication

Load DynamiX is now the first and only storage performance validation solution that can model data compression and inline deduplication, the key data-reduction behaviors storage architects must quantify properly to assess the ROI of adopting flash or hybrid storage. Both technologies substantially reduce the effective price per GB and are critical to flash and hybrid storage adoption.
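Why data generation matters here is easiest to see in a sketch: the dedup ratio a test produces is controlled by how often identical blocks repeat in the stream, and the compression ratio by how much of each block is incompressible. The helper names and ratios below are hypothetical, and zlib merely stands in for whatever inline compression a real array uses:

```python
import random
import zlib

def make_block(size, compress_ratio_hint, rng):
    """Build one block whose compressibility is tuned by mixing
    incompressible random bytes with a compressible zero run."""
    random_part = int(size / compress_ratio_hint)
    return rng.randbytes(random_part) + b"\x00" * (size - random_part)

def make_stream(n_blocks, block_size, dedup_ratio, compress_ratio, rng):
    """Emit n_blocks drawn from a pool of unique blocks; the pool
    size sets the duplicate rate an inline-dedup engine would see
    (dedup_ratio=4 means each unique block appears ~4 times)."""
    unique = [make_block(block_size, compress_ratio, rng)
              for _ in range(max(1, n_blocks // dedup_ratio))]
    return [rng.choice(unique) for _ in range(n_blocks)]

rng = random.Random(0)
stream = make_stream(400, 4096, dedup_ratio=4, compress_ratio=2.0, rng=rng)
logical = len(stream) * 4096
deduped = len(set(stream)) * 4096
compressed = sum(len(zlib.compress(b)) for b in set(stream))
print(f"dedup reduction: {logical / deduped:.1f}x, "
      f"compression on unique data: {deduped / compressed:.1f}x")
```

Sending fully random (or fully repeated) test data to an array with inline data reduction wildly misstates both performance and effective capacity, which is why controlling these two knobs is central to the modeling the release describes.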

“Although storage systems that incorporate flash promise to relieve all storage performance problems, determining which applications justify the need for flash and exactly how much flash to deploy are fundamental questions,” said Jeff Boles, director-Lab Validation Services at Taneja Group. “Workload modeling and performance validation solutions, such as Load DynamiX, will help storage architects and engineers obtain predictive insight into storage infrastructure behavior for performance assurance and cost optimization.”

Load DynamiX will exhibit its products at the Data Storage Innovations Conference in Santa Clara, CA, April 22-24, 2014, and at EMC World 2014 in Las Vegas, May 5-8, in booth #642.

About Load DynamiX

As the leader in storage infrastructure performance validation, Load DynamiX empowers IT professionals with the insight needed to make intelligent decisions regarding networked storage. By accurately characterizing and emulating real-world application behavior, Load DynamiX optimizes the overall performance, availability, and cost of storage infrastructure. The combination of advanced workload analytics and modeling software with extreme load-generating appliances gives IT professionals the ability to cost-effectively validate and stress today’s most complex physical, virtual, and cloud infrastructures to their limits. Visit www.loaddynamix.com for more information.


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
