
Tilera and Quanta Unveil the World's Most Power Efficient and Highest Compute Density Server

Using Tilera's TILEPro64(TM) Processors, the S2Q Server Packs up to 10,000 Cores in a Standard Rack That Consumes Less Than Eight Kilowatts of Power

SANTA CLARA, CA and SAN FRANCISCO, CA -- (Marketwire) -- 06/22/10 -- VELOCITY 2010 and STRUCTURE 2010 -- Tilera® Corporation, the world leader in many-core general-purpose microprocessors for cloud computing and networking applications, and Quanta Computer Inc., one of the world's largest computer ODMs, today unveiled the world's most power-efficient and highest-compute-density server, codenamed S2Q. The S2Q server, designed in collaboration with cloud datacenter providers, end customers, and software partners to tackle today's cloud computing workloads, is aimed at large-scale datacenters running high-performance web, database, hosting, and finance applications.

"This announcement validates the real-world impact of our processors and what they offer," said Omid Tahernia, CEO, Tilera Corporation. "Cloud is changing the way we think about computing. There are new demands for power efficiency and density that are not met by x86 technology. Tilera-based servers are meeting these exact needs in the market."

"We are very excited about the S2Q server. This is a technological breakthrough, providing the required high performance at a fraction of the space and power budget," said Mike Yang, vice president of the Cloud Computing business unit, Quanta Computer Inc. "This server illustrates Quanta's continued leadership in server design, bringing the latest technology to market."

Each S2Q server includes eight Tilera TILEPro64™ processors and replaces eight high-end Intel Xeon 5000-class dual-socket servers, making it the highest-performance, highest-density 2U server in the industry. It provides vendors with a building block for large-scale web clouds. Moreover, the I/O integrated on each processor enables the server to provide up to sixteen 10 GbE interfaces and sixteen 1 GbE interfaces without the power and cost of additional chipsets and networking cards.

World's highest-density, highest-compute 2U server

  • 8 nodes each containing the 64-core TILEPro64 processor
  • 512 cores providing up to 1.3 trillion operations per second
  • 176 Gbps of I/O bandwidth
  • Up to 64 DIMM slots
  • Up to twenty-four 2.5" hot-plug SAS, SATA, or solid-state drives

Power-efficient and eco-friendly server

  • Each server node consumes a maximum of 35-50 watts
  • S2Q servers enable up to 10,000 cores in an eight-kilowatt rack
  • 90 percent efficient hot-plug power supplies
  • Shared fans and power supplies to conserve space and power for an eco-friendly design
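The density and power claims above can be sanity-checked with back-of-the-envelope arithmetic. The release does not specify the rack, so the sketch below assumes a 42U rack with 2U reserved for top-of-rack switching; the per-node core counts and wattage come from the spec lists.

```python
# Back-of-the-envelope check of the "10,000 cores in an eight-kilowatt rack"
# claim. Rack geometry (42U, 2U reserved for switching) is an assumption.
CORES_PER_PROCESSOR = 64                   # TILEPro64
PROCESSORS_PER_SERVER = 8                  # nodes per 2U S2Q chassis
NODE_WATTS_MIN, NODE_WATTS_MAX = 35, 50    # per-node figure from the spec list

RACK_UNITS = 42                            # assumed standard rack
SERVER_UNITS = 2
servers_per_rack = (RACK_UNITS - 2) // SERVER_UNITS   # 20 servers

cores_per_rack = servers_per_rack * PROCESSORS_PER_SERVER * CORES_PER_PROCESSOR
watts_min = servers_per_rack * PROCESSORS_PER_SERVER * NODE_WATTS_MIN
watts_max = servers_per_rack * PROCESSORS_PER_SERVER * NODE_WATTS_MAX

print(cores_per_rack)        # 10240 -> "up to 10,000 cores"
print(watts_min, watts_max)  # 5600 8000 -> node power within an 8 kW budget
```

Under these assumptions, 20 servers yield 10,240 cores at 5.6-8.0 kW of node power, consistent with the headline figures.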

Serviceability and management

  • Front-mounted 2.5" hot-pluggable hard drives
  • Four hot-pluggable 2-node trays
  • Hot-pluggable power supplies
  • IPMI 2.0 dedicated management ports

Tilera's many-core design is ideal for the cloud because cloud applications execute millions of small parallel tasks simultaneously, rather than complex single-threaded programs that require large cores. The TILEPro64 processor features 64 cores running SMP Linux, and Tilera's iMesh™ technology integrates the cores with coherent caches to deliver scalable performance.
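The workload shape described above can be sketched in a few lines. This is illustrative only, not Tilera code: a thread pool stands in for the many cores, and `handle_request` is a hypothetical stand-in for one small unit of cloud work.

```python
# Illustrative sketch (not Tilera code): many small, independent tasks
# map naturally onto many small cores, with no single thread dominating.
from concurrent.futures import ThreadPoolExecutor

def handle_request(req_id: int) -> int:
    # Hypothetical stand-in for a small, independent task,
    # e.g. serving one web request or one key-value lookup.
    return req_id * 2

# 64 workers, echoing the TILEPro64's core count.
with ThreadPoolExecutor(max_workers=64) as pool:
    results = list(pool.map(handle_request, range(10_000)))

print(results[:3])  # [0, 2, 4]
```

The point is the shape of the parallelism: throughput comes from scheduling thousands of tiny tasks across many modest cores rather than from one fast core.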

The S2Q server will be available to customers in September 2010 in limited quantities and generally available in Q4 2010. For additional information on the S2Q server, contact [email protected].

Tilera will be demonstrating the S2Q system on June 22-24 at Velocity 2010 in Santa Clara, Calif., and on June 23-24 at Structure 2010 in San Francisco.

About Tilera
Tilera® Corporation is the industry leader in general-purpose multicore and many-core processors for cloud computing and communications applications. Tilera's processors are based on its breakthrough distributed iMesh™ architecture, which scales to hundreds of general-purpose, low-power cores and will continue to scale with new process technology. With this revolutionary architecture, a standard Linux environment, and standard GNU tools, Tilera delivers unprecedented general-purpose compute capacity at a fraction of the power consumption of legacy processors. Tilera has three product families: the TILE64™, the TILEPro™, and its latest, the TILE-Gx™. The company is headquartered in San Jose, Calif., with locations in Westborough, Mass.; Yokohama, Japan; and Beijing and Shanghai, China.

All trademarks are the property of their respective owners.

FOR MORE INFORMATION:
Tara Sims
siliconPR for Tilera
Email Contact
415 310 5779

