Fujitsu Develops Compact, High-Performance and Energy-Efficient DSP for Mobile Device Baseband Processing

Leverages supercomputer technology to lower energy requirements for smartphones and other wireless mobile devices

Kawasaki, Japan, Feb 1, 2013 - (JCN Newswire) - Fujitsu Laboratories Limited today announced the development of a digital signal processor (DSP) for use in mobile device baseband processing. By employing a vector processing architecture(1) of the kind used in supercomputers, the DSP can efficiently run the highly repetitive processes common in LTE(2) and other wireless signal processing, which in turn allows for greater energy efficiency.

Built using 28-nanometer (nm) process technology and running at 250 MHz, the DSP is capable of processing data at 12 GOPS (12 billion operations per second). Excluding memory, the DSP measures only 0.4 mm² and consumes only 30 milliwatts (mW) of power, 20% less than existing DSPs.

The new DSP is expected to help lengthen talk times, usage times and standby times for smartphones and other mobile devices. In addition, revisions to the signal processing algorithm can be implemented through modifications to the DSP program, enabling fine-tuning of reception characteristics after the wireless baseband LSI has been manufactured, thereby contributing to shorter development lead times.

Details of the new technology will be presented at the 2013 International Symposium on VLSI Design, Automation and Test (2013 VLSI-DAT), scheduled to open on April 22 in Taiwan.

Background

In recent years, smartphones, tablets, and other wireless devices have rapidly gained in popularity. As the speed of wireless networks increases, manufacturers have launched models that support new wireless communications standards such as LTE, which is currently being rolled out worldwide, in addition to conventional standards such as GSM and 3G. Supporting these wireless standards requires a signal processing circuit (baseband processor) capable of the computations each standard defines. As a result, reducing the size and power consumption of baseband processor components is crucial to improving the cost and battery life of devices.

Technological Challenges

Typical baseband processing circuits are implemented using specialized hardware for each communications standard, so supporting multiple standards requires multiple circuits. Consequently, reducing circuit size has proved challenging. Alternatively, each communications standard can be supported in software running on a DSP. Given the massive signal processing requirements of LTE, however, such approaches have struggled to deliver both high processing performance and low energy consumption at the same time.

Newly Developed Technology

Fujitsu Laboratories has developed a new DSP that employs a vector processing architecture of the kind used in supercomputers. The DSP can efficiently run the highly repetitive processes that are common in LTE and other wireless signal processing, thereby achieving greater energy efficiency.

Key features of the newly developed DSP are as follows:

1) Vector processing architecture

The DSP employs a vector architecture of the kind found in supercomputers. In a typical processor, a single instruction operates on a single piece of data at a time (scalar data). By contrast, a vector processor executes a single instruction on multiple pieces of data (vector data) at once. When the same process must be repeated for many data elements, the vector architecture can therefore complete the task with a single instruction, cutting down on the processing and energy spent reading and decoding instructions from memory.
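As a rough illustration of that instruction-count saving, the sketch below contrasts scalar and vector execution of the same element-wise multiply. Only the 64-element vector length comes from the release; the function names and the C-level modelling are hypothetical, not Fujitsu's instruction set.

```c
/* Conceptual sketch only: scalar vs. vector execution of one multiply. */
#include <stddef.h>

#define VLEN 64  /* elements covered by one vector instruction */

/* Scalar style: the instruction stream is fetched and decoded once per element. */
void scale_scalar(const short *in, short coeff, short *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = (short)(in[i] * coeff);
}

/* Vector style: conceptually, each pass of the outer loop is a single vector
 * instruction, so fetch/decode happens once per 64 elements instead of once
 * per element. (Remainder handling omitted for brevity.) */
void scale_vector(const short *in, short coeff, short *out, size_t n) {
    for (size_t i = 0; i + VLEN <= n; i += VLEN)
        for (size_t j = 0; j < VLEN; j++)   /* done by one instruction in hardware */
            out[i + j] = (short)(in[i + j] * coeff);
}
```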

LTE uses the OFDM(3) modulation method and communicates by bundling data carried by up to 1,200 "subcarriers" in a wireless signal. To extract information from an incoming signal, the DSP must apply the same process repeatedly to each subcarrier, which makes the vector approach especially effective.
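A minimal sketch of the kind of per-subcarrier repetition meant here, assuming a simple one-tap channel equalization as the example operation; only the figure of up to 1,200 subcarriers is taken from the release, the rest is illustrative.

```c
/* Illustrative only: identical arithmetic applied to every subcarrier is
 * exactly the repetitive pattern a vector engine handles efficiently. */
#include <complex.h>

#define NUM_SUBCARRIERS 1200  /* maximum LTE subcarrier count cited above */

void equalize(const float complex rx[NUM_SUBCARRIERS],
              const float complex channel[NUM_SUBCARRIERS],
              float complex out[NUM_SUBCARRIERS]) {
    for (int k = 0; k < NUM_SUBCARRIERS; k++)
        out[k] = rx[k] / channel[k];   /* same work repeated per subcarrier */
}
```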

Figure 1 shows a block diagram of the newly developed DSP. The DSP consists of a vector engine, which employs the vector architecture, alongside a conventional CPU. The CPU reads in program code from the instruction memory and decodes each instruction; instructions suited to vector execution are transferred to the vector engine, while other instructions are executed in the CPU as usual.
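In software terms, the dispatch described for Figure 1 amounts to a simple routing decision. The model below is a hypothetical sketch: the opcode test, the function names, and the stub behavior are assumptions, not details from the release.

```c
/* Hypothetical model of the CPU-side dispatch: vector instructions are handed
 * to the vector engine's instruction buffer, everything else runs on the CPU. */
#include <stdio.h>

typedef struct { unsigned opcode; } insn_t;

/* Stub back-ends for the sketch. */
static void execute_on_cpu(const insn_t *i)          { printf("CPU executes opcode 0x%x\n", i->opcode); }
static void queue_for_vector_engine(const insn_t *i) { printf("vector engine queues opcode 0x%x\n", i->opcode); }

/* Assumption: vector instructions are marked by a dedicated opcode bit. */
static int is_vector_insn(const insn_t *i) { return (i->opcode & 0x80u) != 0; }

static void dispatch(const insn_t *i) {
    if (is_vector_insn(i))
        queue_for_vector_engine(i);   /* stored in the vector engine's instruction buffer */
    else
        execute_on_cpu(i);            /* handled by the conventional CPU */
}

int main(void) {
    insn_t scalar_op = { 0x12 }, vector_op = { 0x92 };
    dispatch(&scalar_op);
    dispatch(&vector_op);
    return 0;
}
```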

2) Vector engine optimized for baseband processing

Figure 2 shows an internal diagram of the vector engine itself. Instructions transferred from the CPU are stored in the instruction buffer. The sequencer decodes the stored instructions one by one and controls the appropriate vector processing pipeline(4) to execute each one. The number of data elements (vector length) that a single instruction can operate on is 64, a value optimized for mobile device baseband processing. Rather than processing the 64 elements sequentially, eight parallel processing elements work through them in eight rounds, thereby achieving higher speeds.
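The following small sketch models that 64-element, eight-lane arrangement, using a vector add as the example operation; only the 64-element vector length and the eight parallel processing elements come from the release.

```c
/* Models how one 64-element vector instruction is worked through:
 * eight processing elements operate in parallel, so 64 / 8 = 8 rounds. */
#define VLEN   64   /* vector length per instruction */
#define LANES   8   /* parallel processing elements */

void vector_add_64(const int a[VLEN], const int b[VLEN], int c[VLEN]) {
    for (int round = 0; round < VLEN / LANES; round++)    /* 8 rounds */
        for (int lane = 0; lane < LANES; lane++) {        /* 8 elements handled in parallel */
            int idx = round * LANES + lane;
            c[idx] = a[idx] + b[idx];
        }
}
```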

The vector engine is designed for a small circuit size. To increase the efficiency of baseband processing, it has two pipelines for multiply instructions on vector data and two pipelines for load (or, conversely, store) instructions that move vector data between memory and the register file, which temporarily holds the data, for a total of four pipelines. All four pipelines can process addition, subtraction, and logic operations on vector data.

Results

With the addition of the newly developed vector engine, many pieces of data can be processed with a single instruction, enabling more efficient data processing and significantly reducing energy consumption in wireless baseband LSIs. Built using 28 nm process technology and running at 250 MHz, the DSP is able to process 12 GOPS (12 billion operations per second). Fujitsu Laboratories succeeded in developing a compact DSP that measures only 0.4 mm² (excluding memory) and consumes only 30 mW of power, a 20% improvement over existing DSPs.
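As a quick plausibility check of those headline figures (simple arithmetic; the only inputs taken from the release are the two quoted numbers), 12 GOPS at a 250 MHz clock corresponds to an average of 48 operations per cycle. How those operations are distributed across the four pipelines is not specified.

```c
/* Back-of-the-envelope check: average ops per cycle = throughput / clock rate. */
#include <stdio.h>

int main(void) {
    double ops_per_second = 12e9;   /* 12 GOPS, as quoted */
    double clock_hz       = 250e6;  /* 250 MHz, as quoted */
    printf("average ops per cycle: %.0f\n", ops_per_second / clock_hz);  /* prints 48 */
    return 0;
}
```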

The new DSP is expected to help lengthen talk times, usage times and standby times for smartphones and other mobile devices. In addition, revisions to the signal processing algorithm can be implemented through modifications to the DSP program, enabling fine-tuning of reception characteristics after the wireless baseband LSI has been manufactured, thereby contributing to shorter development lead times.

Future Development

The new DSP will be incorporated into a communications processor from Access Network Technology Limited that is scheduled for use in Fujitsu smartphones and elsewhere. Going forward, Fujitsu Laboratories plans to continue making performance improvements to the processor to enable it to keep pace with advances in higher speed wireless communications standards.

(1) Vector processing architecture: A processor architecture for processing calculations on vector data (a one-dimensional array of data) with a single instruction.
(2) Long Term Evolution (LTE): The name of the latest mobile communications standard for wireless devices.
(3) Orthogonal Frequency-Division Multiplexing (OFDM): A wireless modulation scheme also used in wireless LANs.
(4) Processing pipeline: A circuit that executes an arithmetic operation according to an instruction. By dividing execution into multiple stages, calculations proceed in an assembly-line fashion.

For further details with diagrams, please visit www.fujitsu.com/global/news/pr/archives/month/2013/20130201-04.html.

About Fujitsu Limited

Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Over 170,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE:6702) reported consolidated revenues of 4.5 trillion yen (US$54 billion) for the fiscal year ended March 31, 2012. For more information, please see www.fujitsu.com.



Source: Fujitsu Limited

Contact:
Fujitsu Limited
Public and Investor Relations
www.fujitsu.com/global/news/contacts/
+81-3-3215-5259


Copyright 2013 JCN Newswire. All rights reserved. www.japancorp.net
