Dimension Data Opens New Cloud Data Center In Canada

New Managed Cloud Platform in Toronto to support growing cloud demand in the region

TORONTO, Sept. 3, 2014 /PRNewswire/ -- Dimension Data, the U.S. $6 billion global ICT solutions and services provider, has launched a new Managed Cloud Platform™ (MCP) in Toronto, Canada. The Toronto MCP is part of a global network of cloud data centers that deliver cloud services to clients across the globe. The new Toronto location brings the number of MCPs Dimension Data has deployed worldwide to 12. In addition to reducing latency and providing secure, easy-to-use cloud services, Dimension Data's Toronto MCP will meet in-country data requirements for Canadian organizations and address the growing demand for cloud services across the region.

"The MCP's enhanced reliability, flexible capacity on-demand, automation and orchestration through our CloudControl and API-based integration offer clear benefits to clients and OneCloud partners in Canada and around the world," said Wendy Lucas, country manager, Dimension Data Canada. "We are pleased to offer this powerful platform with global functionality in the Canadian market."

Dimension Data's network of 12 MCPs offers clients a choice of global cloud data centers in which to deploy servers and storage. Using a web-based interface or API, clients can choose the Toronto destination, provision and configure virtual local area networks (VLANs) and firewalls, and report on administration and usage. As Dimension Data's first MCP in Canada, the Toronto location reduces latency through proximity and addresses data sovereignty concerns by enabling global enterprises to locate data inside Canada.
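The provisioning workflow described above can be sketched in code. This is a hypothetical illustration only: the endpoint field names below are assumptions made for the example and are not taken from Dimension Data's actual CloudControl API documentation.

```python
import json

# Hypothetical sketch of API-driven provisioning: a client selects a
# data center (e.g. the Toronto MCP), deploys a VLAN, and adds a
# firewall rule. All field names here are illustrative assumptions,
# not Dimension Data's real API schema.

def build_vlan_request(datacenter_id, name, base_address):
    """Build the JSON body for a hypothetical 'deploy VLAN' call."""
    return {
        "datacenterId": datacenter_id,   # e.g. the Toronto location
        "name": name,
        "privateIpv4BaseAddress": base_address,
    }

def build_firewall_rule(network_id, action, protocol, port):
    """Build the JSON body for a hypothetical firewall-rule call."""
    return {
        "networkId": network_id,
        "action": action,            # e.g. "ACCEPT" or "DROP"
        "protocol": protocol,        # e.g. "TCP"
        "destinationPort": port,
    }

vlan = build_vlan_request("CA-TOR", "app-tier", "10.0.1.0")
rule = build_firewall_rule("net-123", "ACCEPT", "TCP", 443)
print(json.dumps(vlan))
print(json.dumps(rule))
```

In practice such requests would be sent over HTTPS to the provider's management endpoint; the point of the sketch is only that VLANs and firewall policy are configured as structured API payloads rather than by manual data-center work.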

All Dimension Data MCPs are backed by Service Level Agreements (SLAs) guaranteeing 99.99% availability, as well as 24/7 phone support and integrated management capabilities. Additionally, the new MCP has been built on Cisco, EMC and VMware architecture to provide multiple layers of security and administrative controls to users.

Steve Nola, group executive of Dimension Data's ITaaS Business Unit, said, "The cloud economy contributes nearly CA $5 billion annually to Canadian GDP, and this contribution will continue to grow. Toronto is the data-center capital of Canada. The addition of Dimension Data's MCP in Toronto aligns well with future client growth and demand, provides scalability of client services and a consumption-based pricing model, and will benefit Canadian clients currently in other MCPs that are waiting to move operations there."

In May, Dimension Data announced its plans for an MCP in New Zealand, which is expected to launch later this year. Other Dimension Data MCP locations are: Santa Clara, California and Ashburn, Virginia, U.S.; London, U.K.; Amsterdam, The Netherlands; Sydney and Melbourne, Australia; Johannesburg, South Africa; Tokyo, Japan; Hong Kong, China; and Sao Paulo, Brazil.

Visit http://cloud.dimensiondata.com to learn more about Dimension Data and its cloud computing offerings.

About Dimension Data
Founded in 1983, Dimension Data plc is an ICT services and solutions provider that uses its technology expertise, global service delivery capability, and entrepreneurial spirit to accelerate the business ambitions of its clients. Dimension Data is a member of the NTT Group. Visit us at http://www.dimensiondata.com/en-US and www.facebook.com/DimensionDataAmericas or follow us on Twitter: @DimensionDataAM.

For further information please contact:


Nicole O'Brien
Dimension Data Americas
T: 571-203-4117
E: [email protected]

Karen Pantinas
Davies Murphy Group
T: 781-418-2413
E: [email protected]

SOURCE Dimension Data
