By JCN Newswire
January 29, 2013 12:25 AM EST
- Reduces virtual desktop operating latency to less than 1/6 of previous levels
Kawasaki, Japan, Jan 29, 2013 - (JCN Newswire) - Fujitsu Laboratories Limited today announced the development of a new data transfer protocol that, by taking a software-only approach, can significantly improve the performance of file transfers, virtual desktops, and various other communications applications.
Conventionally, when using transmission control protocol (TCP)(1) - the standard protocol employed in communications applications - in a low-quality communications environment, such as when connected to a wireless network or during times of line congestion, data loss (packet loss) can occur, leading to significant drops in transmission performance due to increased latency from having to retransmit data.
To address this problem, Fujitsu Laboratories has succeeded with a software-only approach, developing: 1) A new protocol, based on user datagram protocol (UDP)(2) - a protocol optimized for delivering streaming media - that incorporates an efficient, proprietarily developed retransmission method able to reduce the latency caused by data retransmission when packet loss occurs; 2) Control technology that addresses the problem of UDP transmissions consuming excess bandwidth by measuring available network bandwidth in real time and securing an optimal amount of communications bandwidth without overwhelming TCP's share; and 3) Technology that, by employing the new protocol, makes it possible to easily speed up existing TCP applications without having to modify them.
Through a simple software installation, the new technology will make it possible to speed up TCP applications that previously required costly specialized hardware, and it can also be easily incorporated into mobile devices and other kinds of equipment. Moreover, compared with TCP, the technology enables a greater than 30 times improvement in file transfer speeds between Japan and the US, in addition to reducing virtual desktop operating latency to less than 1/6 of previous levels. This, in turn, is expected to make it easier to take advantage of various applications employing international communication lines and wireless networks which are anticipated to become increasingly widespread.
With the increased popularity of mobile devices and cloud services in recent years, a wide range of applications have begun to utilize communications capabilities. In many applications, such as file transfer, virtual desktop, and other communications applications, TCP is employed as a standard communications protocol. One issue with TCP is that data loss (packet loss) can occur in low-quality communications environments, resulting in significant drops in transmission performance (reduced throughput and higher latency) due to increased latency from having to retransmit data. In the future, it is expected that there will be greater opportunities to take advantage of international communications lines and wireless networks, making it necessary to ensure that transmission performance does not drop even when connected to a low-quality communications environment.
Currently, one well-known method of speeding up application transmission speeds in low-quality communications environments is to employ specialized acceleration hardware. This kind of specialized equipment, however, is expensive and bulky, making it difficult to incorporate into mobile devices. High-speed transmission methods for transferring files using software-based acceleration also exist, but to support a variety of existing TCP applications using these methods, it has been necessary to make modifications to the traffic processing components of each application.
Newly Developed Technology
By developing a proprietary software-based transfer protocol, Fujitsu Laboratories has succeeded in significantly improving the throughput and operating latency of existing TCP applications.
Key features of the new technology are as follows:
1) New protocol improves throughput and latency in low-quality communications environments
Fujitsu has developed a new protocol that incorporates a proprietarily developed, efficient retransmission method based on UDP, a protocol optimized for delivering streaming media. As a result, the new protocol reduces the latency caused by data retransmission when packet loss occurs. The protocol can quickly distinguish between packets that have been lost and packets that simply have not yet arrived at their destination, thereby preventing unnecessary retransmissions and the latency they cause. By implementing the new protocol as a software layer on top of UDP, it is possible to maintain the high speeds typical of UDP while compensating for UDP's main weaknesses: packet loss and out-of-order delivery. This, in turn, has enabled improvements in both throughput and latency. In a comparison with standard TCP, the new protocol achieved a throughput increase of over 30 times in a simulated file transfer between Japan and the US, and packet delivery latency was reduced to less than 1/6 of previous levels.
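The loss-detection idea described above can be sketched as follows. This is a hypothetical illustration, not Fujitsu's actual protocol: the sender numbers each datagram and uses selective acknowledgments from the receiver to tell provably lost packets (a gap behind an already-acknowledged later packet) apart from packets still in flight, retransmitting only the former instead of waiting for a TCP-style timeout.

```python
# Hypothetical sketch of selective retransmission over UDP-style datagrams.
# Packets carry sequence numbers; the receiver reports which numbers it has
# seen, and the sender retransmits only packets that are provably lost
# (a later packet has already arrived), avoiding timeout-driven latency.

class SelectiveRepeatSender:
    def __init__(self):
        self.next_seq = 0
        self.unacked = {}          # seq -> payload awaiting acknowledgment

    def send(self, payload):
        """Assign a sequence number and remember the payload for retransmission."""
        seq = self.next_seq
        self.next_seq += 1
        self.unacked[seq] = payload
        return seq, payload        # in a real system this datagram goes over UDP

    def on_ack(self, acked_seqs):
        """Process a selective ACK: drop acknowledged packets and return the
        sequence numbers that must be retransmitted immediately.

        A packet is considered lost (not merely late) when a packet sent
        *after* it has already been acknowledged - the gap proves loss,
        so no retransmission timeout is needed."""
        for seq in acked_seqs:
            self.unacked.pop(seq, None)
        if not acked_seqs:
            return []
        highest = max(acked_seqs)
        return sorted(s for s in self.unacked if s < highest)


sender = SelectiveRepeatSender()
for chunk in [b"a", b"b", b"c", b"d"]:
    sender.send(chunk)

# Receiver acknowledged packets 0, 2 and 3; packet 1 fell into a gap.
lost = sender.on_ack([0, 2, 3])
print(lost)  # [1]
```

The key point is that the gap itself is the loss signal, so retransmission can begin one round trip sooner than with a conservative retransmission timer.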
2) Control technology for securing bandwidth without overwhelming other TCP traffic
Fujitsu Laboratories developed a control technology that, by measuring available network bandwidth in real time, can secure an optimal amount of communications bandwidth without overwhelming the share of bandwidth used by other TCP communications on the same line. For example, when other TCP communications are using relatively little bandwidth, the bandwidth share for the new protocol will increase, and when other TCP communications are taking up a higher percentage of bandwidth, the new protocol will use a smaller share.
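One simple way to realize this kind of TCP-friendly behavior can be sketched as follows. This is a hypothetical illustration, not Fujitsu's published algorithm: the sender periodically estimates how much of the link other traffic is consuming and caps its own sending rate at the remaining headroom, minus a safety margin.

```python
# Hypothetical sketch of TCP-friendly rate control: the sender periodically
# measures how much of the link's capacity other (TCP) flows are consuming
# and limits its own UDP-based sending rate to a fraction of the leftover
# bandwidth, so existing TCP traffic is never crowded out.

def target_rate(link_capacity_mbps, observed_tcp_mbps, safety_margin=0.1):
    """Return the sending rate (Mbps) the UDP-based protocol should use.

    safety_margin reserves a slice of the free bandwidth as headroom,
    so short TCP bursts are absorbed without causing packet loss."""
    free = max(0.0, link_capacity_mbps - observed_tcp_mbps)
    return free * (1.0 - safety_margin)


# On a 100 Mbps link: when TCP flows are light, the new protocol ramps up;
# when TCP traffic grows, the protocol automatically backs off.
print(target_rate(100, 20))   # 72.0 -> TCP light, use most of the link
print(target_rate(100, 80))   # 18.0 -> TCP heavy, take a smaller share
```

Re-evaluating this target on every measurement interval yields exactly the behavior described above: the protocol's share grows when the line is idle and shrinks when TCP traffic increases.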
3) Technology for accelerating existing TCP applications without any modifications
Fujitsu Laboratories has developed a technology that automatically converts the standard TCP traffic of a wide variety of applications into the new protocol described in (1) above. This makes it possible to significantly improve the speed of a host of existing applications, including file transfer, virtual desktop, and web browsing applications, all without the need for any modifications.
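Conceptually, such transparent acceleration can work like a local proxy: the application's unmodified TCP byte stream is intercepted, sliced into sequence-numbered datagrams for the new protocol, and reassembled into an identical byte stream on the far side. A minimal sketch of that framing step (hypothetical, for illustration only):

```python
# Hypothetical sketch of the framing a transparent TCP accelerator could use:
# the local proxy slices the application's TCP byte stream into numbered
# datagrams, and the remote proxy reassembles them - even if they arrive
# out of order - into the exact original stream before handing it to TCP.

def encapsulate(stream: bytes, mtu: int = 4):
    """Split a TCP byte stream into (sequence_number, chunk) datagrams."""
    return [(i, stream[off:off + mtu])
            for i, off in enumerate(range(0, len(stream), mtu))]

def reassemble(datagrams):
    """Rebuild the original byte stream from datagrams in any arrival order."""
    return b"".join(chunk for _, chunk in sorted(datagrams))


original = b"GET /index.html"
datagrams = encapsulate(original)
datagrams.reverse()                       # simulate out-of-order arrival
print(reassemble(datagrams) == original)  # True
```

Because the application only ever sees an ordinary TCP socket on each side, no modification to its traffic-processing code is required; the conversion happens entirely in the intermediary layer.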
The use of the new technology is expected to speed up the performance of a wide range of communications applications employing international communication lines and wireless networks, which are expected to become increasingly widespread. For instance, the technology can help speed up web browsing and file downloads in mobile communications environments where signal quality deteriorates due to building obstructions or movement. In addition, the technology can improve data transfer speeds between datacenters in Japan and the US. It is also expected to improve the usability of virtual desktops when a virtual desktop located on a remote server is accessed over a low-quality communications environment.
During fiscal 2013, Fujitsu Laboratories aims to commercialize the new technology as a communications middleware solution for improving communications speeds without having to modify existing TCP applications.
(1) Transmission Control Protocol (TCP): An Internet protocol that guarantees data delivery through a retransmission mechanism.
(2) User Datagram Protocol (UDP): An Internet protocol that does not guarantee data delivery.
About Fujitsu Limited
Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Over 170,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE:6702) reported consolidated revenues of 4.5 trillion yen (US$54 billion) for the fiscal year ended March 31, 2012. For more information, please see www.fujitsu.com.
Source: Fujitsu Limited
Fujitsu Limited Public and Investor Relations www.fujitsu.com/global/news/contacts/ +81-3-3215-5259
Copyright 2013 JCN Newswire. All rights reserved. www.japancorp.net