Stream Data Centers Announces New Dallas-Area Hyperscale Data Center Development

DFW VII will be Stream's next hyperscale data center for cloud and enterprise customers, now underway in Garland, Texas, with commissioning targeted for late 2018

DALLAS, April 17, 2018 /PRNewswire/ -- Stream Data Centers is pleased to announce that it has acquired approximately 23 acres for construction of its new Dallas-area campus development in Garland, Texas. When completed in late 2018, the newly constructed DFW VII facility will offer an expandable 140,000 square foot data center with redundant 40 MW utility feeds from an on-site substation provided by Oncor. Ultimately the campus will total approximately 400,000 square feet.

"Stream's DFW VII data center will follow a successful formula from our previous Dallas-area developments and benefit from the best practices and improvements we've made along the way," states Paul Moser, Co-Managing Partner of Stream Data Centers.

Stream Data Centers' new DFW campus development will feature:

  • A 22.66-acre site on Lookout Drive in Garland, Texas, in close proximity to the amenities, power and fiber infrastructure of Northeast Dallas, Richardson and Garland.
  • An initial 140,000-square-foot, structurally enhanced data center with land available for multiple phases.
  • Two (2) 40 MW utility feeds from a new on-site Oncor substation offering power at competitive transmission rates.
  • Two (2) diverse telco entrances, with a strong mix of local, long-haul and dark fiber providers reaching the site over multiple routes.
  • A design that meets or exceeds the size and capital investment requirements of the House Bill 1223 sales tax exemption program.
  • An optimal location outside of flight paths, railways and the FEMA 500-year flood plain.

Stream has worked with multiple network service providers to secure diverse dark fiber paths back to the major carrier hotels and cloud interconnection locations, allowing its customers to reach more than 100 network and cloud providers beyond those in the immediate area. The site will offer multiple diverse paths to any major interconnection point or data center in the metro area.

"Our new development in Garland, Texas seeks to address the growing needs of cloud companies and enterprise users in and around the Dallas market," Moser said. "We believe that our DFW VII data center will meet the needs of companies looking for highly-secure and resilient data center space with low-latency connectivity in the Dallas market."  

ABOUT STREAM DATA CENTERS

Since 1999, Stream has been an active investor and industry leader, providing premium services, optimized value and critical environments to Fortune 500 companies. To date, Stream has acquired, developed and operated more than two million square feet of data center space in Texas, Minnesota, Illinois, California and Colorado, representing more than 200 megawatts of power.

Stream develops and operates highly resilient, scalable and efficient data centers, with products including fully-commissioned Hyperscale Data Centers, Private Data Center™ Suites, Ready-to-Fit™ Powered Shells, Build-to-Suit Infrastructure and Scalable Colocation Environments – all with immediate connection to network carriers and public cloud providers.

Stream Data Centers is committed to improving the data center experience through exceptional people and service. Its critical environment and energy procurement services combine the skill sets and resources of Stream's technical real estate professionals with fine-tuned data center and energy expertise to deliver end-to-end solutions for mission-critical infrastructure needs.

By understanding the dynamics of the data center market, Stream is able to forecast customer demand to proactively and cost-effectively deploy the right products at the right time. These disciplines, aligned with significant capital and industry expertise, keep Stream customers ahead of the data center planning curve. Learn more at www.streamdatacenters.com.

Media Contact
Mary McAuliffe
[email protected]
214-560-2427

 


View original content with multimedia: http://www.prnewswire.com/news-releases/stream-data-centers-announces-new-dallas-area-hyperscale-data-center-development-300631340.html

SOURCE Stream Data Centers
