
TransCanada Disappointed and Frustrated with Keystone Delay

CALGARY, ALBERTA -- (Marketwired) -- 04/21/14 -- TransCanada Corporation (TSX:TRP) (NYSE:TRP) (TransCanada) today released the following statement from Russ Girling, TransCanada's President and Chief Executive Officer, related to another process delay announced by the U.S. Department of State (DOS) April 18, 2014 with respect to the permitting process for Keystone XL.

DOS said on April 18 that it will seek the input of eight federal agencies in the course of assessing the National Interest Determination. DOS had previously asked for their views by early May; it has now notified the agencies that it is extending that period, providing more time for the submission of their views on the proposed project. A core reason, in the words of DOS, is the potential impact of the Nebraska Supreme Court case, which could ultimately affect the pipeline route.

Statement from Russ Girling:

"We are extremely disappointed and frustrated with yet another delay. American men and women will miss out on another construction season where they could have worked to build Keystone XL and provided for their families. We feel for them.

We are also disappointed the United States will continue to rely on regimes that are fundamentally opposed to American values for the eight to nine million barrels of oil that are imported every day. A stable, secure supply of oil from Canada and from the U.S. makes better sense, and I am sure a majority of Americans agree.

Another delay is inexplicable. The first leg of our Keystone pipeline began shipping oil to refineries outside of St. Louis in 2010. It is about the same length of pipe as Keystone XL, carries the same oil and also crosses the 49th parallel. It took just 21 months to study and approve. After more than 2,000 days, five exhaustive environmental reviews and over 17,000 pages of scientific data, Keystone XL continues to languish. Our Keystone pipeline has safely delivered more than 600 million barrels of crude oil to U.S. refineries, replacing foreign offshore oil.

The Nebraska routing situation is being managed appropriately. A notice of appeal was filed by Nebraska's Attorney General in February on the same day as the district court judge's decision regarding LB1161. This action 'stays' the lower court decision, meaning LB1161 is still valid and the Keystone XL re-route in Nebraska that was evaluated by the Nebraska Department of Environmental Quality and approved by the Governor remains in effect.

Our view remains that the current 90-day National Interest Determination process that is now underway should not be impacted by the Nebraska lower court ruling since the approved re-route remains valid during appeal.

North American oil production is up dramatically and will continue to rise. That means that without Keystone XL, more oil will be shipped by rail and by barge. As the State Department concluded in its recent Final Supplemental Environmental Impact Statement, not approving Keystone XL will lead to higher greenhouse gas (GHG) emissions through other oil transportation options, as well as greater public risk. Not building Keystone XL is a lose-lose-lose scenario any way you look at it.

Keystone XL improves American energy security, minimizes the environmental and safety impacts of moving that oil to U.S. refineries, supports jobs and American businesses, and continues to have the support of a strong majority of Americans and Congress. It is truly in the national interest of America; in poll after poll, a majority of Americans continue to agree and simply want this pipeline built.

It is unfortunate that interest groups and paid activists are blocking energy security, saying no to jobs, and creating a situation that actually leads to higher GHG emissions and greater public risk. Canadian oil will make its way to market with or without Keystone XL. It is in everyone's best interests that this project move forward."

With more than 60 years' experience, TransCanada is a leader in the responsible development and reliable operation of North American energy infrastructure, including natural gas and oil pipelines, power generation and gas storage facilities. TransCanada operates a network of natural gas pipelines that extends more than 68,500 kilometres (42,500 miles), tapping into virtually all major gas supply basins in North America. TransCanada is one of the continent's largest providers of gas storage and related services, with more than 400 billion cubic feet of storage capacity. A growing independent power producer, TransCanada owns or has interests in over 11,800 megawatts of power generation in Canada and the United States. TransCanada is developing one of North America's largest oil delivery systems. TransCanada's common shares trade on the Toronto and New York stock exchanges under the symbol TRP. For more information, visit TransCanada's website or check us out on Twitter @TransCanada.

TransCanada Media Enquiries:
Shawn Howard/Grady Semmens/Davis Sheremata
403.920.7859 or 800.608.7859

TransCanada Investor & Analyst Enquiries:
David Moneta/Lee Evans
403.920.7911 or 800.361.6522


