By Business Wire
August 26, 2014 01:00 AM EDT
KDDI (TOKYO:9433), the global telecommunications provider, announced today that it will invest US$270 million (28 billion yen) to build two new TELEHOUSE data centers, “TELEHOUSE OSAKA 2” and “TELEHOUSE TOKYO Tama 3”, scheduled to open in August 2015 and February 2016, respectively. These additional facilities will bring total global TELEHOUSE data center space to 371,000 sqm across 46 sites in 13 countries/territories and 24 major cities.
TELEHOUSE TOKYO Tama 3 (Photo: Business Wire)
Both facilities will offer high-density colocation services capable of hosting heavy-load IT infrastructure. The data centers will meet growing demand for housing private enterprise clouds alongside public cloud service providers and online and media content companies. The facilities are designed to Tier 3+ data center standards for redundancy and uptime, and will be protected against earthquakes by a long-period seismic absorption structure, reflecting the high-quality Global TELEHOUSE specifications. In addition, KDDI can provide a combined solution by offering direct access to the KDDI global network alongside the data center.
TELEHOUSE Osaka 2, in the center of Osaka city, will offer 700 racks of tenancy space with up to 30kVA (designed) per rack. The technical and operations rooms will be situated above the second floor, avoiding the risk of potential flooding. Osaka 2 can also serve as a disaster recovery and backup site for the Tokyo data centers.
TELEHOUSE Tokyo Tama 3 will be located on the existing Tama data center campus, 30km from the Tokyo city center in highly secure surroundings. The five-storey building will offer 1,300 racks of tenancy space with up to 42kVA (designed) power supply per rack, the highest*1 in Japan and five times*2 the industry average, with a designed PUE of 1.31, making it one of the most energy-efficient data centers in Japan.
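To put the published figures in context, the sketch below shows how PUE (Power Usage Effectiveness, the ratio of total facility power to power delivered to IT equipment) relates to the designed rack capacities above. The rack counts, per-rack power and PUE come from the announcement; the function name and the simplifying assumption that kVA equals kW (power factor of 1.0) are ours, for illustration only.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Designed figures from the announcement (treating kVA as kW for simplicity).
tama3_racks, tama3_kva_per_rack = 1_300, 42
osaka2_racks, osaka2_kva_per_rack = 700, 30

tama3_it_capacity = tama3_racks * tama3_kva_per_rack      # 54,600 kVA
osaka2_it_capacity = osaka2_racks * osaka2_kva_per_rack   # 21,000 kVA

# At the designed PUE of 1.31, a fully loaded Tama 3 facility would draw
# roughly 1.31x its IT load in total.
tama3_total_draw = tama3_it_capacity * 1.31               # 71,526 kVA

print(f"Tama 3 designed IT capacity: {tama3_it_capacity:,} kVA")
print(f"Osaka 2 designed IT capacity: {osaka2_it_capacity:,} kVA")
print(f"Tama 3 total draw at PUE 1.31: {tama3_total_draw:,.0f} kVA")
```

A lower PUE means less overhead (cooling, power distribution) per unit of useful IT power, which is why a designed PUE of 1.31 is notable for a facility of this density.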
This expansion reflects the growing demand for data center space in Japan's metropolitan areas and KDDI's ability to accommodate leading multinationals expanding their portfolios in Japan. By securing its domestic market share in Japan, KDDI continues to be the leader in premium data center facilities.
TELEHOUSE is the pioneering data center colocation provider, originally established in 1989. It manages and operates 46 carrier-neutral data centers globally, serving over 2,000 companies and hosting some of the world's leading internet exchanges. Designed for mission-critical IT system housing and connectivity, TELEHOUSE supports the infrastructure of the world's leading network operators, cloud providers and financial institutions.

KDDI Corporation is a leading Japanese telecommunications provider and a Global Fortune 300 company. Serving 40 million domestic mobile subscribers, KDDI provides a diverse portfolio consisting of managed networks, data centers, cloud, security and system integration.
(*1) As of August 26, 2014. Research by KDDI based on public information.
(*2) Compared with TELEHOUSE data centers in Japan.