By JCN Newswire
February 1, 2013 01:44 AM EST
Kawasaki, Japan, Feb 1, 2013 - (JCN Newswire) - Fujitsu Laboratories Limited today announced the development of a digital signal processor (DSP) for use in mobile device baseband processing. By employing a vector processing architecture(1) as used in supercomputers, the DSP can efficiently run highly repetitive processes common in LTE(2) and other wireless processes. This, in turn, allows for greater energy efficiency.
Built with 28-nanometer (nm) process technology and running at 250 MHz, the DSP is capable of processing data at 12 GOPS (12 billion operations per second). Excluding memory, the DSP measures only 0.4 mm², and it consumes only 30 milliwatts (mW) of power, 20% less than existing DSPs.
The new DSP is expected to help lengthen talk times, usage times and standby times for smartphones and other mobile devices. In addition, revisions to the signal processing algorithm can be implemented through modifications to the DSP program, enabling fine-tuning of reception characteristics after the wireless baseband LSI has been manufactured, thereby contributing to shorter development lead times.
Details of the new technology will be presented at the 2013 International Symposium on VLSI Design, Automation and Test (2013 VLSI-DAT), scheduled to open on April 22 in Taiwan.
In recent years, smartphones, tablets, and other wireless devices have rapidly gained in popularity. As the speed of wireless networks increases, manufacturers have launched models that support new wireless communications standards such as LTE, which is currently being rolled out worldwide, in addition to conventional standards such as GSM and 3G. To support these wireless standards, a signal processing circuit (baseband processor) that performs each standard's computations is required. As a result, being able to reduce the size and power consumption of baseband processor components is crucial to improving the cost and battery life of devices.
Typical baseband processing circuits are implemented using specialized hardware to support each communications standard, requiring a number of circuits to support different standards. Consequently, reducing the size of circuits has proved challenging. On the other hand, there exist alternative software-based approaches to supporting each communications standard using DSPs. Given the massive signal processing requirements of LTE, however, these approaches are limited in their ability to simultaneously achieve both high processing performance and low energy consumption.
Newly Developed Technology
Fujitsu Laboratories has developed a new DSP that employs a vector processing architecture used in supercomputers. The DSP can efficiently run highly repetitive processes that are common in LTE and other wireless processes, thereby achieving greater energy efficiency.
Key features of the newly developed DSP are as follows:
1) Vector processing architecture
The DSP employs a vector architecture found in supercomputers. With a typical processor, a single instruction is executed on a single piece of data at a time (scalar data). By contrast, a vector processor executes a single instruction on multiple pieces of data (vector data) at a time. As a result, when the same process must be repeated for many data elements, the vector architecture completes the task with a single instruction, cutting down on the processing and energy required to read and decode instructions from memory.
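The instruction-overhead saving can be illustrated with a toy cost model. The numbers below are illustrative only (not from Fujitsu's design): assume every issued instruction pays one fetch/decode cost regardless of how much data it touches, so a vector instruction covering many elements amortizes that overhead across all of them.

```python
# Toy cost model (hypothetical, for illustration): one fetch/decode per
# issued instruction. A scalar machine issues one instruction per data
# element; a vector machine issues one instruction per vector chunk.

def instructions_issued(n_elements, vector_length):
    """Return (scalar, vector) instruction counts for n_elements."""
    scalar = n_elements                       # one instruction per element
    vector = -(-n_elements // vector_length)  # ceil: one per vector chunk
    return scalar, vector

scalar, vector = instructions_issued(1200, 64)
# 1200 scalar instructions vs. 19 vector instructions for 1,200 elements
```

With a vector length of 64, processing 1,200 elements takes 19 vector instructions instead of 1,200 scalar ones, which is where the fetch/decode energy saving comes from.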
LTE uses the OFDM(3) modulation method and communicates by bundling data that is carried by up to 1,200 "subcarriers" in a wireless signal. To extract information from an incoming signal, the DSP must apply the same process repeatedly for each subcarrier. This makes the vector approach more effective.
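The per-subcarrier repetition can be sketched as follows. This is a hypothetical example (the article does not describe Fujitsu's actual receiver algorithm): a simple zero-forcing equalizer applies the identical complex division to every one of up to 1,200 subcarriers, exactly the uniform, element-wise pattern that a single vector instruction can cover.

```python
# Hypothetical per-subcarrier equalization sketch: the same operation
# (dividing each received sample by its channel estimate) repeats for
# every OFDM subcarrier, so it vectorizes naturally.

def equalize(received, channel):
    """Zero-forcing equalization: one complex division per subcarrier."""
    return [r / h for r, h in zip(received, channel)]

rx = [complex(2, 2)] * 1200  # received symbols (toy values)
ch = [complex(1, 1)] * 1200  # channel estimates (toy values)
eq = equalize(rx, ch)        # same division applied to all 1,200 subcarriers
```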
Figure 1 shows a block diagram of the newly developed DSP. The DSP consists of a vector engine, which employs a vector architecture, as well as a conventional CPU. The CPU reads in program code from the instruction memory, decodes the instruction, and if the vector approach is suitable for use on the instruction, it is transferred to the vector engine where it is executed. Other instructions are executed in the CPU as usual.
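The CPU/vector-engine split described above can be modeled as a simple dispatch step. The opcode names below are assumptions for illustration (the article does not list the instruction set): the CPU decodes each instruction and forwards vector opcodes to the vector engine, executing everything else itself.

```python
# Minimal dispatch sketch of the CPU / vector-engine split.
# VECTOR_OPCODES is a hypothetical set; the real ISA is not published here.

VECTOR_OPCODES = {"vadd", "vmul", "vload", "vstore"}

def dispatch(instruction):
    """Decide where a decoded instruction is executed."""
    opcode = instruction.split()[0]
    if opcode in VECTOR_OPCODES:
        return "vector_engine"  # transferred to the vector engine
    return "cpu"                # executed in the CPU as usual
```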
2) Vector engine optimized for baseband processing
Figure 2 shows an internal diagram of the vector engine itself. Instructions transferred from the CPU are stored in the instruction buffer. Stored instructions are decoded one by one by the sequencer, which controls the required vector processing pipeline(4) to execute each instruction. The number of data elements that can be processed by a single instruction (the vector length) is 64, a value optimized for use in mobile device baseband processing. Rather than processing the 64 data elements sequentially, eight parallel processing elements process the data in eight rounds, thereby achieving higher speeds.
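The lane/round split can be sketched with the parameters given above (vector length 64, eight parallel processing elements); the hardware runs the eight lanes concurrently, which the sequential loop below only simulates.

```python
# Sketch of the lane/round execution scheme: a 64-element vector is
# processed 8 elements at a time by 8 parallel lanes, completing in
# 64 / 8 = 8 rounds. In hardware the 8 lanes run simultaneously.

VECTOR_LENGTH = 64
NUM_LANES = 8

def rounds_for(vector_length, num_lanes):
    return vector_length // num_lanes

def process_vector(data, op):
    """Apply op to a full vector, NUM_LANES elements per round."""
    assert len(data) == VECTOR_LENGTH
    result = []
    for r in range(rounds_for(VECTOR_LENGTH, NUM_LANES)):
        chunk = data[r * NUM_LANES:(r + 1) * NUM_LANES]
        result.extend(op(x) for x in chunk)  # the 8 lanes of one round
    return result

out = process_vector(list(range(64)), lambda x: x * 2)
```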
The vector engine features a small circuit size. To increase the efficiency of baseband processing, it contains two pipelines for processing multiply instructions on vector data, plus two pipelines for processing load and store instructions, which move vector data between memory and the register file (a temporary data store), for a total of four pipelines. All four pipelines can also process addition, subtraction, and logic operations on vector data.
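The four-pipeline layout can be captured in a small capability table. The pipeline names below are assumptions for illustration (the article does not name them): two pipelines accept multiplies, two accept loads and stores, and any of the four can take add, subtract, and logic operations.

```python
# Rough model of the four vector pipelines (hypothetical names):
# mul0/mul1 handle multiplies, ls0/ls1 handle loads and stores, and
# all four handle addition, subtraction, and logic operations.

PIPELINES = {
    "mul0": {"mul", "add", "sub", "logic"},
    "mul1": {"mul", "add", "sub", "logic"},
    "ls0":  {"load", "store", "add", "sub", "logic"},
    "ls1":  {"load", "store", "add", "sub", "logic"},
}

def eligible_pipelines(op):
    """Return the pipelines that can issue an instruction of this type."""
    return sorted(name for name, ops in PIPELINES.items() if op in ops)
```

Under this model, a multiply can issue on two pipelines while an add can issue on any of the four, which is what lets the engine keep all pipelines busy on the add/subtract/logic work that dominates baseband processing.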
With the addition of the newly developed vector engine, many pieces of data can be processed with a single instruction, thereby enabling more efficient data processing. This, in turn, will significantly contribute to reduced energy consumption in wireless baseband LSIs. A DSP using 28 nm process technology and running at 250 MHz is able to process 12 GOPS (12 billion operations per second). Fujitsu Laboratories succeeded in developing a compact DSP that measures only 0.4 mm² (without memory) and, in terms of power consumption, requires only 30 mW - a 20% improvement over existing DSPs.
The new DSP is expected to help lengthen talk times, usage times and standby times for smartphones and other mobile devices. In addition, revisions to the signal processing algorithm can be implemented through modifications to the DSP program, enabling fine-tuning of reception characteristics after the wireless baseband LSI has been manufactured, thereby contributing to shorter development lead times.
The new DSP will be incorporated into a communications processor from Access Network Technology Limited that is scheduled for use in Fujitsu smartphones and elsewhere. Going forward, Fujitsu Laboratories plans to continue making performance improvements to the processor to enable it to keep pace with advances in higher speed wireless communications standards.
(1) Vector processing architecture: A processor architecture for processing calculations on vector data (a one-dimensional array of data) with a single instruction.
(2) Long Term Evolution (LTE): The name of the latest mobile communications standard for wireless devices.
(3) Orthogonal Frequency-Division Multiplexing (OFDM): A wireless modulation scheme also used in wireless LANs.
(4) Processing pipeline: A circuit that executes an arithmetic process according to an instruction. Execution is divided into multiple stages, allowing calculations to proceed in an assembly-line fashion.
For further details with diagrams, please visit www.fujitsu.com/global/news/pr/archives/month/2013/20130201-04.html.
About Fujitsu Limited
Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Over 170,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE:6702) reported consolidated revenues of 4.5 trillion yen (US$54 billion) for the fiscal year ended March 31, 2012. For more information, please see www.fujitsu.com.
Source: Fujitsu Limited
Fujitsu Limited Public and Investor Relations www.fujitsu.com/global/news/contacts/ +81-3-3215-5259
Copyright 2013 JCN Newswire. All rights reserved. www.japancorp.net