By Marketwired
November 28, 2012 04:46 PM EST
LOS ANGELES, CA -- (Marketwire) -- 11/28/12 -- T5 Data Centers (www.t5datacenters.com), innovators in providing state-of-the-art, customizable, and highly reliable computing support for any enterprise, today announced the completion and commissioning of its new T5@LA data center at 444 North Nash Street, El Segundo, California. The T5 team has successfully commissioned the first 3.7 megawatts (MW) of what will ultimately be 16.65 MW of critical power, so the new data center is ready for mission-critical operations for lease to enterprise customers.
In addition, T5 announced the signing of the first lease at T5@LA. The Los Angeles-based critical infrastructure agency is the first enterprise customer and will occupy a custom-built data hall. It selected T5@LA because of the facility's seismic rating, dedicated power supply, and redundant systems that ensure uninterrupted service, as well as T5 Data Centers' reputation for reliability and data security. Occupancy will begin in 2013.
The new T5@LA data center is a purpose-built, 205,000-square-foot building on a 6.1-acre site. Designed as a high-quality, fully equipped wholesale data center, the facility is customizable to customer needs and features 120,000 square feet of raised floor. The building is constructed from structurally enhanced steel and has been designed to withstand earthquakes with a seismic importance factor of 1.5. T5@LA also has a dedicated, on-site power substation to ensure highly reliable electricity at a reduced "wholesale" cost.
"Los Angeles has been an underserved market for server-ready data center services," said Pete Marin, President of T5. "T5@LA is the first purpose-built wholesale data center in the Los Angeles basin. We're seeing great demand for our high-quality and super-efficient design."
The new T5@LA data center was designed for large enterprise customers seeking a customizable, "always-on" computing environment. The facility takes advantage of cool coastal air to help chill the data halls via an indirect evaporative process that uses approximately half the energy of traditional data center designs. T5 maintains round-the-clock engineering and support staff, as well as redundant systems, to deliver the best possible service and peace of mind. The company already supports a number of Fortune 200 companies at its other data center facilities.
"We are very excited to welcome our first tenant to T5@LA," said Aaron Wangenheim, Executive Vice President of T5, responsible for marketing and leasing. "T5@LA was chosen by this discerning customer because of our ability to deliver the right solution and the right efficiencies. Their critical applications required the most resilient data center possible, and we're excited to welcome them to T5@LA."
About T5 Data Centers
T5 Data Centers (T5) is a leading national data center owner and operator, committed to delivering customizable, scalable data centers that provide an "always-on" computing environment. T5 Data Centers provides enterprise and wholesale data center services to organizations across North America using best-in-class technology and techniques, designing the MEP (mechanical, electrical, plumbing) plant to achieve a low Power Usage Effectiveness (PUE) and thus deliver the lowest possible total cost of operations for its clients. T5 currently has business-critical data center facilities in Atlanta, Los Angeles, Dallas, and Charlotte, with new projects announced in Portland and Colorado. All of T5's data center projects are purpose-built facilities with the robust design, power, and redundancy required for mission-critical computing applications, and all are staffed 24 hours a day.
For more information, visit www.t5datacenters.com.