Fujitsu Develops Data-Transfer Acceleration Technology

Applicable to various types of network environments; data deduplication and compression enable up to a 10-fold improvement in throughput using software alone

Tokyo, Apr 8, 2014 - (JCN Newswire) - Fujitsu Laboratories Ltd. today announced the development of a technology for accelerating data transfer speeds that can be applied to various types of network environments used for cloud, mobile applications and other services.

Companies are increasingly moving their business systems to the cloud and consolidating their servers into datacenters. This is driving rapid growth in the volume of data transferred between datacenters, as well as between datacenters and end-user equipment. While specialized hardware or network upgrades could speed up transfers, these approaches are expensive and cannot be easily deployed in cloud, virtual, or mobile environments.

Fujitsu Laboratories has now developed a technology that uses data deduplication and compression to reduce the volume of data transferred, enabling dramatically faster transfers using software alone. The technology can increase effective data-transfer rates by up to 10 times without the added cost of network upgrades, and can boost transfer speeds across a wide range of network environments, from cloud datacenters to end-user equipment.

This technology will be exhibited at Fujitsu Forum 2014, running May 15-16 at Tokyo International Forum in Tokyo.

Background

The popularity of cloud services and mobile devices means that ever more applications are delivered over networks spanning a variety of communications environments. Every day, companies use datacenters for file transfers, file sharing, backups, and distributing data to mobile devices. With the volume of data transfers expanding daily, there is a growing need to accelerate transfers inexpensively, without network upgrades or the deployment of specialized equipment.

Issues

Previously, Fujitsu Laboratories conducted R&D on high-speed protocols as a way to accelerate data transfers. This is an effective method when a network's bandwidth is not being used to its full capacity. However, when bandwidth is fully utilized, solving problems such as network delays required network upgrades.

In order to accelerate transfers without upgrading networks, another effective approach is to reduce the volume of data being transferred. There are two techniques for doing this. One is data deduplication, in which both sender and recipient save copies of data so that duplicated data does not need to be retransmitted. The other is to compress the data using file-compression technology. Data deduplication and compression are already used in storage systems and in specialized hardware for accelerating data transfers, and are effective at raising transfer rates. Implementing them in software so they can be used on mobile devices, however, poses the following problems (a simplified sketch of the deduplication idea appears after the list):

- Data deduplication relies on both sender and recipient to save copies of data, and this is difficult to implement on mobile devices that have limited storage capacity.
- Both deduplication and compression are processor-intensive, and on mobile devices that have limited CPU power, these techniques can slow down effective transfer rates.
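
To make the deduplication idea above concrete, the following is a minimal sketch, not Fujitsu's implementation, of how a sender and receiver could avoid retransmitting chunks both sides have already seen. The fixed chunk size, SHA-256 chunk identifiers, and message format are illustrative assumptions.

import hashlib

CHUNK_SIZE = 4096  # hypothetical fixed chunk size

def chunks(data: bytes):
    for i in range(0, len(data), CHUNK_SIZE):
        yield data[i:i + CHUNK_SIZE]

class DedupSender:
    def __init__(self):
        self.seen = set()  # hashes of chunks the receiver already holds

    def encode(self, data: bytes):
        """Replace chunks the receiver already has with short hash references."""
        messages = []
        for c in chunks(data):
            h = hashlib.sha256(c).digest()
            if h in self.seen:
                messages.append(("ref", h))      # send only the 32-byte reference
            else:
                messages.append(("raw", h, c))   # send the chunk itself
                self.seen.add(h)
        return messages

class DedupReceiver:
    def __init__(self):
        self.store = {}  # hash -> chunk bytes saved from earlier transfers

    def decode(self, messages):
        data = bytearray()
        for msg in messages:
            if msg[0] == "raw":
                _, h, c = msg
                self.store[h] = c
                data.extend(c)
            else:                                # "ref": rebuild from the local copy
                data.extend(self.store[msg[1]])
        return bytes(data)

In this sketch, a second transfer of a largely identical file sends mostly 32-byte references instead of 4 KB chunks, which is the source of the speed-up; the mobile-device problems listed above correspond to the growth of self.seen and self.store and to the hashing cost.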

About the Technology

Fujitsu Laboratories has now developed a high-speed data transfer technology that uses software-based data deduplication and compression to cut data volumes, effectively boosting transfer speeds even on mobile devices. Features of the technology are as follows.

1. High deduplication performance by minimizing storage requirement

Fujitsu Laboratories has developed a technology that selects data that appears with high statistical frequency on the network and prioritizes it for saving. This maintains high deduplication performance while reducing the volume of low-frequency data being saved, so that the limited memory on mobile devices can be used more effectively. In internal testing, this was found to reduce the volume of duplicate data saved in the mobile device's storage by as much as 80%.
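
The release does not describe the selection algorithm in detail; one hypothetical way to approximate frequency-based selection is to count how often each chunk hash recurs and retain only the most frequently observed chunks within a fixed storage budget, roughly a least-frequently-used eviction policy. The sketch below is illustrative only.

from collections import Counter

class FrequencyCappedStore:
    """Keep only the most frequently recurring chunks within a fixed budget."""

    def __init__(self, max_chunks: int = 1024):
        self.max_chunks = max_chunks
        self.freq = Counter()   # how often each chunk hash has been observed
        self.store = {}         # hash -> chunk bytes actually retained

    def observe(self, h: bytes, chunk: bytes):
        self.freq[h] += 1
        if h in self.store:
            return
        if len(self.store) < self.max_chunks:
            self.store[h] = chunk
            return
        # Evict the least frequently seen stored chunk if the new one recurs more often.
        coldest = min(self.store, key=lambda k: self.freq[k])
        if self.freq[h] > self.freq[coldest]:
            del self.store[coldest]
            self.store[h] = chunk

    def has(self, h: bytes) -> bool:
        return h in self.store

Only chunks whose hashes keep reappearing occupy the limited store, which mirrors the stated goal of dropping low-frequency data while preserving most of the deduplication benefit.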

2. Lightweight data compression technology

Fujitsu Laboratories has developed a technology that reduces the CPU load imposed by the compression process to as little as one-fourth the level of conventional technologies. This technology searches for repeating patterns that appear within the data. When no pattern is found, it searches at longer intervals. When a pattern is found, data in close proximity is searched in particular detail. This results in efficient searching overall, while greatly reducing the time required for compression.
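
The search strategy described here resembles the skip-ahead matching used by fast LZ-family compressors. The sketch below is an approximation under that assumption, not Fujitsu's code: the scan stride grows while no repeated pattern is found and drops back to byte-by-byte searching once a repeat appears.

def find_repeats(data: bytes, min_match: int = 4):
    """Scan for repeated byte patterns, skipping faster while nothing matches."""
    table = {}          # pattern -> earlier position where it was last seen
    matches = []
    pos, misses = 0, 0
    while pos + min_match <= len(data):
        key = data[pos:pos + min_match]
        prev = table.get(key)
        if prev is not None:
            matches.append((prev, pos, min_match))  # (earlier pos, current pos, length)
            misses = 0                              # a hit: resume fine-grained scanning
            step = 1
        else:
            misses += 1
            step = 1 + (misses >> 5)                # no hits: gradually lengthen the stride
        table[key] = pos
        pos += step
    return matches

The trade-off is the one the release points to: in regions with no repetition the scanner touches far fewer positions, cutting CPU time, at the cost of possibly overlooking some matches; where repetition is found it reverts to detailed searching so the compression ratio stays high.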

3. Automatic selection technology for operational improvements

Deduplication and compression can at times lower effective transfer rates, for example when the data to be transferred is small or contains little duplication. The net effectiveness of these techniques depends on data size, available network bandwidth, and CPU power. The technology therefore periodically evaluates these parameters, assesses the expected acceleration benefit, and determines whether deduplication or compression will be effective in practice. This frees system administrators from configuring settings for every deployed environment and simplifies operations.
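
As a rough illustration of the kind of decision such automatic selection could make (the formula and numbers below are invented for the example, not taken from the release), the predicted transfer time with and without compression can be compared using the measured data size, available bandwidth, and compression throughput:

def should_compress(size_bytes: float,
                    link_bytes_per_sec: float,
                    compress_bytes_per_sec: float,
                    expected_ratio: float) -> bool:
    """Compress only if doing so is predicted to shorten the total transfer time."""
    plain_time = size_bytes / link_bytes_per_sec
    compressed_time = (size_bytes / compress_bytes_per_sec               # CPU time to compress
                       + size_bytes * expected_ratio / link_bytes_per_sec)  # time to send the smaller payload
    return compressed_time < plain_time

# Example: a 50 MB file over a 10 Mbps (1.25 MB/s) link, with compression running
# at 40 MB/s and halving the data. The slow link dominates, so compressing wins.
print(should_compress(50e6, 1.25e6, 40e6, 0.5))  # True

With a fast LAN or a very small payload the same comparison flips, which is exactly the case the paragraph above says the technology must detect.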

Results

This newly developed technology is expected to make it easier to use a variety of network applications in the cloud and in virtual and mobile environments, which are anticipated to become ever more prevalent. Because the technology is delivered as software, it can be integrated into existing servers and operating systems, bringing its benefits to notebook computers, tablets, smartphones, and other mobile devices.

In one possible scenario, a salesperson visiting Company A would download sales materials for Company A from headquarters, and then, when visiting Company B, would download slightly different sales materials for Company B. The parts of the materials that are identical would not need to be re-downloaded, effectively increasing download speed. Over a connection with 10 Mbps of available bandwidth, this has been shown to increase effective transfer rates by 10 times.
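
A back-of-the-envelope check of that figure, with illustrative numbers rather than measured ones: if about 90% of the second download duplicates material already on the device, only a tenth of the bytes cross the 10 Mbps link, so the effective rate approaches ten times the raw link speed.

def effective_rate_mbps(link_mbps: float, duplicate_fraction: float) -> float:
    """Effective transfer rate when a fraction of the data need not be resent."""
    return link_mbps / (1.0 - duplicate_fraction)

print(effective_rate_mbps(10.0, 0.9))  # ~100 Mbps effective over a 10 Mbps link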

Future Plans

Fujitsu Laboratories will conduct practical testing of this technology with the goal of commercializing it during fiscal 2014. The technology is expected to improve the quality of data transfers across the board, from datacenters on wide-area networks to end users on mobile devices.

About Fujitsu Limited

Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Approximately 170,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE: 6702) reported consolidated revenues of 4.4 trillion yen (US$47 billion) for the fiscal year ended March 31, 2013. For more information, please see www.fujitsu.com.



Source: Fujitsu Limited

Contact:
Fujitsu Limited
Public and Investor Relations
www.fujitsu.com/global/news/contacts/
+81-3-3215-5259


Copyright 2014 JCN Newswire. All rights reserved. www.japancorp.net
