By JCN Newswire
May 29, 2014 12:53 AM EDT
The new product is well suited to scientific computing that requires ultra-high-speed parallel processing and large-scale data analysis. NEC will contribute to the advancement of the ICT infrastructure that supports social development and innovation by providing it to research institutes and enterprises.
The SX-ACE is a new vector supercomputer equipped with a multi-core vector CPU that delivers the world's top single-core performance of 64 GFLOPS and the largest memory bandwidth per core, at 64 GB/s(1). Its performance per rack is 10 times that of previous models(2), with a per-rack computing performance of 16 teraFLOPS (hereinafter "TFLOPS") and a memory bandwidth of 16 TB/s. It is especially suited to scientific and engineering computing and to data-intensive applications requiring high-speed processing of big data, and it achieves high sustained performance in a wide range of simulations, including weather forecasting, analysis of global environmental change, fluid-dynamics analysis, nanotechnology, and the development of new materials.
Using NEC's leading-edge LSI technology, a high-density packaging design, and high-efficiency cooling technology, the SX-ACE also reduces power consumption by 90% and requires only 20% of the floor space of existing models(2).
Cyberscience Center, Tohoku University
Supercomputer system scheduled to be launched in October 2014
As one of Japan's nationwide joint-use facilities, the Cyberscience Center of Tohoku University promotes the deployment and operation of the latest academic information infrastructure, including high-performance computing and networks, as well as education and research on the creation of new cyber-science that makes use of this infrastructure. The SX-ACE system NEC will deliver comprises 40 racks (2,560 nodes), with maximum theoretical computing performance significantly increased to 706 TFLOPS, more than 25 times that of the existing system.
Professor Hiroaki Kobayashi, director of the Cyberscience Center at Tohoku University, said the Center "has utilized vector supercomputers since the first deployment of the SX-1 in 1986, and the successive systems boast a high usage rate. I consider that, in addition to the SX series' highly effective performance and ease of programming, end users appreciate the Center's activities, such as support for performance enhancement of programs. We want to promote user support to bring out the best of potential performance of the new SX-ACE supercomputer system, and joint research concerning acceleration and advancement of simulations through close cooperation among users, vendor engineers, and the Center's researchers. We also want to promote the latest academic research in various fields, including emerging research and even practical research, and to contribute to the advancement of Japan's HPCI system. We will work on these tasks by utilizing the results for exascale computing based on the 'Feasibility Study of a Future HPCI System for Memory Intensive Applications,' which was implemented with support from the Ministry of Education, Culture, Sports, Science and Technology (MEXT)."
Cybermedia Center (CMC), Osaka University
Supercomputer system scheduled to be launched in December 2014
The Cybermedia Center (CMC) of Osaka University accumulates diverse knowledge and results on the latest large-scale computation, information technology, multimedia content and education at Osaka University, and works closely with education and research organizations inside and outside the university, as well as with industry, to facilitate advanced technology transfer. It also aims to serve as a center that responds to internationalization and the advanced information society, and is open to local communities. The new SX-ACE, with 24 racks (1,536 nodes and a maximum theoretical performance of 423 TFLOPS), will be operated in conjunction with scalar-type supercomputers for use in a wide range of fields.
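The system-level figures quoted for the two centers are consistent with the per-core numbers given in footnote (1), assuming each SX-ACE node carries a single four-core vector CPU; the release does not state the node configuration, so the core count per node here is an assumption. A minimal arithmetic sketch:

```python
# Sanity check of the quoted peak-performance figures.
# Assumption (not stated in the release): one 4-core vector CPU per node,
# with 69 GFLOPS maximum theoretical per-core performance (footnote 1).

CORES_PER_NODE = 4          # assumed SX-ACE node configuration
PEAK_PER_CORE_GFLOPS = 69   # maximum theoretical per core (footnote 1)

def peak_tflops(nodes: int) -> float:
    """Maximum theoretical performance of an SX-ACE system, in TFLOPS."""
    return nodes * CORES_PER_NODE * PEAK_PER_CORE_GFLOPS / 1000

# Tohoku University: 40 racks, 2,560 nodes -> quoted as 706 TFLOPS
print(peak_tflops(2560))   # 706.56

# Osaka University: 24 racks, 1,536 nodes -> quoted as 423 TFLOPS
print(peak_tflops(1536))   # 423.936
```

Under the same assumption, one rack (2,560 nodes / 40 racks = 64 nodes) yields 64 x 4 x 64 GFLOPS of vector add/multiply performance, roughly the 16 TFLOPS per-rack figure quoted above.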
Professor Shojiro Nishio, director of the Cybermedia Center (CMC) at Osaka University, said the Center "has developed excellent operation technologies jointly with NEC, with careful attention to usability, since its deployment of the SX-1 back in 1986. As a noteworthy outcome of this work, the 'fair-share scheduling system' operated at this center enjoys high usage. Currently, supercomputing faces various issues that must be addressed, such as ever-growing power demands and the handling of big data. To overcome these issues, the Center will build a new facility this fiscal year, called the IT Core Building, that focuses on efficient air-conditioning for supercomputer operations, aiming to be the world's most environmentally friendly supercomputing center, and will continue to pursue solutions to the competing demands of ever-increasing scientific computing capability and power constraints.
"In the last fiscal year, the Center introduced a reconfigurable PC cluster and a large-scale visualization system, both of which will be used in combination with the SX-ACE vector supercomputer to be installed this fiscal year. By utilizing this infrastructure, the Center will actively promote research in e-science as the fourth scientific method, foster the human resources that will underpin next-generation scientific computing, and advance and deepen the IT that supports future scientific and engineering computing, such as real-time visualization of massive computational results."
National Institute for Environmental Studies
Supercomputer system scheduled to be launched in June 2015
The National Institute for Environmental Studies has been researching diverse environmental issues by drawing on its expertise and the latest facilities. The issues it addresses range from the local to the global level, from air pollution and water quality to the environmental impact of chemicals and global warming. The new supercomputer is the successor to the 8-node SX-9 system introduced as the first stage of deployment last year; it will later be upgraded to a 384-node SX-ACE configuration and used for environmental research in various fields, including global environmental studies. Operation of the SX-ACE will commence in June 2015.
Leveraging its vector technology, NEC will explore the development of next-generation high-performance servers targeting industrial applications and big data analysis, in addition to conventional supercomputing fields, and will continue to lead development in this area.
Based on its Mid-term Management Plan 2015, the NEC Group is working towards the safety, security, efficiency and equality of society. NEC aims to develop solutions for a wide range of issues as a company that creates value through the promotion of its "Solutions for Society" which provide advanced social infrastructure utilizing ICT.
(1) According to NEC data as of May 29.
Core performance: 64 GFLOPS (for vector additions and multiplications)
Maximum theoretical core performance: 69 GFLOPS (for all concurrently available operations)
(2) According to NEC data as of May 29 comparing the existing model SX-9 with the new model.
About NEC Corporation
NEC Corporation (TSE: 6701) is a leader in the integration of IT and network technologies that benefit businesses and people around the world. By providing a combination of products and solutions that cross utilize the company's experience and global resources, NEC's advanced technologies meet the complex and ever-changing needs of its customers. NEC brings more than 100 years of expertise in technological innovation to empower people, businesses and society. For more information, visit NEC at http://www.nec.com.
Source: NEC Corporation
Seiichiro Toda NEC Corporation [email protected] +81-3-3798-6511
Copyright 2014 JCN Newswire. All rights reserved. www.japancorp.net