Load DynamiX Simplifies Storage Performance Workload Modeling to Accelerate Flash and Hybrid Storage System Adoption

Load DynamiX, the leader in storage infrastructure performance validation, today announced its latest storage workload modeling products, which enable enterprise storage architects and engineers to accelerate deployment of solid-state technologies, validate performance, and optimize storage budgets.

Key offerings include:

  • An all-in-one two rack unit (2RU) appliance that can emulate tens of thousands of clients across all SAN and NAS environments.
  • The first-ever VDI application workload model that looks at application bottlenecks from the storage perspective.
  • The industry’s first and only application modeling of data compression and inline deduplication, which is key to investment decision-making for flash and hybrid storage systems.

“Our new products represent two key initiatives: lowering the cost and reducing the complexity of deploying storage performance validation solutions, while greatly enhancing the value of storage workload modeling,” stated Philippe Vincent, CEO at Load DynamiX. “These breakthroughs will help storage vendors and architects dramatically improve their performance testing and change validation processes – especially around the adoption of flash and hybrid storage products.”

New Appliances

To cost-effectively stress test enterprise SAN, NAS, and Object storage systems, the new LDX Enterprise Series solution combines the Load DynamiX Enterprise workload modeling application with a load generation appliance containing both 10GbE and FC testing interface ports in a single 2RU enclosure.

This new appliance complements the recently introduced high-density variants of the company's award-winning FC (16G) and Ethernet (10GE and 1GE) appliances, helping customers stress their largest storage systems, including unified appliances for SAN and NAS environments.

A new virtual appliance is available as a software-only version of the Load DynamiX load generation appliance. Aimed at technology vendor engineering organizations, it allows development engineers to duplicate proven approaches already used by their QA organizations. Vendors can now test earlier in their development process and accelerate time to market while improving product quality.

Simplified Storage Workload Modeling

Load DynamiX continues to deliver on its strategy to build the industry’s most complete storage workload modeling solution that simulates real-world applications. Enhancements include a simpler user interface, improved analytics and flexible charting capabilities for easier product comparisons. New reporting templates will greatly facilitate typical use cases, such as performance capacity planning and head-to-head product comparisons.

A new VDI application workload model is specifically designed to help users evaluate storage systems for VDI deployments. By creating a large number of VDI Linked Clones and measuring VDI boot storm performance, storage architects can see how well the underlying storage infrastructure will perform under varying user and workload conditions. Armed with this information, storage architects can make intelligent decisions that eliminate both under- and over-provisioning of storage.
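As a rough, minimal sketch of the idea behind boot-storm testing (not Load DynamiX's product), the hypothetical example below approximates a boot storm as many concurrent "clones" issuing random reads against one shared backing file, then reports aggregate throughput and the slowest simulated boot. The file name, clone count, and I/O sizes are illustrative assumptions.

```python
# Hypothetical sketch: approximating a VDI "boot storm" as many concurrent
# readers hitting one shared backing file, then reporting aggregate throughput.
# The file name, clone count, and I/O sizes are illustrative assumptions,
# not Load DynamiX parameters.
import os
import random
import time
from concurrent.futures import ThreadPoolExecutor

BACKING_FILE = "golden_image.bin"   # stands in for the linked-clone base image
CLONES = 100                        # simulated desktops booting at once
READS_PER_CLONE = 200               # read requests issued by each "boot"
READ_SIZE = 64 * 1024               # 64 KiB reads

def boot_one_clone(clone_id: int) -> float:
    """Issue random reads against the shared image; return elapsed seconds."""
    size = os.path.getsize(BACKING_FILE)
    start = time.perf_counter()
    with open(BACKING_FILE, "rb") as f:
        for _ in range(READS_PER_CLONE):
            f.seek(random.randrange(0, max(1, size - READ_SIZE)))
            f.read(READ_SIZE)
    return time.perf_counter() - start

if __name__ == "__main__":
    # Create a throwaway 64 MiB "golden image" if one is not already present.
    if not os.path.exists(BACKING_FILE):
        with open(BACKING_FILE, "wb") as f:
            f.write(os.urandom(64 * 1024 * 1024))

    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=CLONES) as pool:
        per_clone = list(pool.map(boot_one_clone, range(CLONES)))
    wall = time.perf_counter() - wall_start

    total_bytes = CLONES * READS_PER_CLONE * READ_SIZE
    print(f"boot storm of {CLONES} clones finished in {wall:.1f}s")
    print(f"aggregate read throughput: {total_bytes / wall / 1e6:.1f} MB/s")
    print(f"slowest simulated boot: {max(per_clone):.1f}s")
```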

Granular workload models for NFSv3, SMB 2, iSCSI and Fibre Channel have also been added. The new models enable a very high degree of customization and detail around storage access patterns, block/file sizes and distributions, directory structures, and load properties.
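To make the notion of a granular workload model concrete, here is a hypothetical sketch of the kinds of parameters such a model captures: protocol, read/write mix, access pattern, size distributions, directory structure, and load properties. The field names and defaults are illustrative assumptions, not Load DynamiX's actual configuration schema.

```python
# Hypothetical sketch of a granular workload descriptor; field names and
# defaults are illustrative assumptions, not Load DynamiX's schema.
from dataclasses import dataclass, field

@dataclass
class SizeDistribution:
    """Block/file sizes in bytes mapped to their relative weight."""
    weights: dict = field(
        default_factory=lambda: {4096: 0.6, 65536: 0.3, 1048576: 0.1}
    )

@dataclass
class WorkloadModel:
    protocol: str = "NFSv3"          # NFSv3, SMB2, iSCSI, or FC
    read_pct: float = 0.7            # read/write mix
    random_pct: float = 0.8          # random vs. sequential access pattern
    io_sizes: SizeDistribution = field(default_factory=SizeDistribution)
    dir_depth: int = 4               # directory-tree depth for file protocols
    files_per_dir: int = 100
    concurrent_clients: int = 10000  # load property: emulated client count
    ramp_seconds: int = 60           # load property: ramp-up time

# Example: a 50/50 read/write SMB2 workload, everything else at defaults.
model = WorkloadModel(protocol="SMB2", read_pct=0.5)
print(model)
```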

Modeling of Compression & Deduplication

Load DynamiX is now the first and only storage performance validation solution that can model data compression and inline deduplication, key behaviors storage architects must quantify properly to assess the ROI of adopting flash or hybrid storage. Both technologies substantially reduce the effective price per GB and are critical to flash and hybrid storage adoption.
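As a rough illustration of why this matters (and not a depiction of Load DynamiX's implementation), the hypothetical sketch below generates a data stream whose compressibility and duplicate-block fraction can be dialed in; test content with realistic reducibility is what lets a workload model exercise inline compression and deduplication meaningfully. The block size and ratios are illustrative assumptions.

```python
# Hypothetical sketch: generate a data stream with a tunable compressibility
# and duplicate-block fraction, the kind of content needed to exercise
# inline compression and deduplication. Ratios and block size are assumptions.
import os
import zlib

BLOCK_SIZE = 4096

def make_block(compress_ratio: float) -> bytes:
    """Mix zeros (compressible) with random bytes to hit a rough target ratio."""
    compressible = int(BLOCK_SIZE * compress_ratio)
    return b"\x00" * compressible + os.urandom(BLOCK_SIZE - compressible)

def make_stream(blocks: int, dedupe_ratio: float, compress_ratio: float) -> bytes:
    """dedupe_ratio is the fraction of blocks that repeat an earlier block."""
    unique_count = max(1, int(blocks * (1 - dedupe_ratio)))
    unique = [make_block(compress_ratio) for _ in range(unique_count)]
    # Cycle through the unique blocks so the requested fraction repeats.
    return b"".join(unique[i % unique_count] for i in range(blocks))

data = make_stream(blocks=1000, dedupe_ratio=0.5, compress_ratio=0.6)
print(f"raw size: {len(data)} bytes")
print(f"zlib-compressed: {len(zlib.compress(data))} bytes")
unique_blocks = {data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)}
print(f"unique 4K blocks: {len(unique_blocks)}")
```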

“Although storage systems that incorporate flash promise to relieve all storage performance problems, determining which applications justify the need for flash and exactly how much flash to deploy are fundamental questions,” said Jeff Boles, director, Lab Validation Services at Taneja Group. “Workload modeling and performance validation solutions, such as Load DynamiX, will help storage architects and engineers obtain predictive insight into storage infrastructure behavior for performance assurance and cost optimization.”

Load DynamiX will be exhibiting its products at the Data Storage Innovations Conference in Santa Clara, CA, April 22-24, 2014, and at EMC World 2014 in Las Vegas, May 5-8, in booth #642.

About Load DynamiX

As the leader in storage infrastructure performance validation, Load DynamiX empowers IT professionals with the insight needed to make intelligent decisions regarding networked storage. By accurately characterizing and emulating real-world application behavior, Load DynamiX optimizes the overall performance, availability, and cost of storage infrastructure. The combination of advanced workload analytics and modeling software with extreme load-generating appliances gives IT professionals the ability to cost-effectively validate and stress today’s most complex physical, virtual, and cloud infrastructure to its limits. Visit www.loaddynamix.com for more information.
