Gulf Coast Project Begins Delivering Crude Oil to Nederland, Texas

NEDERLAND, TEXAS -- (Marketwired) -- 01/22/14 -- TransCanada Corporation (TSX: TRP) (NYSE: TRP) (TransCanada) announced today that at approximately 10:45 a.m. CST on January 22, 2014, the Gulf Coast Project began delivering crude oil on behalf of our customers to Texas refineries. The completion of this US$2.3 billion crude oil pipeline provides a safe and direct connection between the important oil hub in Cushing, Oklahoma, and delivery points on the U.S. Gulf Coast.

"This is a very important milestone for TransCanada, our shippers and Gulf Coast refiners who have been waiting for a pipeline to supply oil directly from Cushing," said Russ Girling, president and chief executive officer. "This project is a critical, modern piece of American energy infrastructure that allows producers to safely connect growing production with the world's most efficient refiners on the U.S. Gulf Coast. It also provides those American refineries the opportunity to use more of the crude oil produced in both Canada and the United States for decades to come."

Construction of the 487-mile crude oil pipeline involved more than 11 million hours of labor by 4,844 workers in the United States and more than 50 contracts with manufacturers and companies across the U.S. that built the pipeline and supplied equipment. The project also includes the addition of 2.25 million barrels of new oil storage capacity at Cushing, Oklahoma.

"The workers who helped build this project are in addition to the 8,969 men and women who constructed the initial Keystone Pipeline system, and we are awaiting approval of Keystone XL so we can employ more than 9,000 additional Americans who are ready to put their skills and experience to work," added Girling.

The Gulf Coast Project was designed to help relieve the glut of crude oil in places like Cushing, Oklahoma, and will transport growing U.S. production to meet refinery demand. It provides Gulf Coast refineries with access to lower cost domestic production and reduces America's reliance on foreign sources of crude oil. In addition, the standards that TransCanada adopted for the Gulf Coast Project have set a new bar for the safety and design of modern crude oil pipelines. These include a higher number of remotely controlled shutoff valves, increased pipeline inspections, increased standards for pipeline construction, maintenance and integrity, and burying the pipe deeper in the ground.

The U.S. is the largest oil consumer in the world, using 15 million barrels of crude oil every day. The latest data shows that about half of that is imported. Even with growing U.S. production and increasing fuel efficiency standards, the International Energy Agency and the U.S.'s own Energy Information Administration forecast the need for between four and six million barrels of imported crude oil a day through 2040.

"As we bring the Gulf Coast Project into commercial operation, and look forward to the final review for Keystone XL, it is important to remember that we have a choice about where to get the oil we need to maintain our quality of life," concluded Girling. "That choice is stable American and Canadian oil transported through our Keystone system versus higher priced, unstable crude oil from countries such as Venezuela that do not share or support American values."

The Gulf Coast Project is a 487-mile (780-kilometer), 36-inch crude oil pipeline beginning in Cushing, Oklahoma, and extending south to Nederland, Texas, to serve the Gulf Coast marketplace. The Gulf Coast Project will have the initial capacity to transport up to 700,000 barrels of oil per day with the potential to transport up to 830,000 barrels of oil per day to Gulf Coast refineries. TransCanada is currently projecting pipeline capacity of 520,000 barrels per day for the first year of operation. For more information, please visit Gulf-Coast-Pipeline.com.

The 48-mile (77-kilometer) Houston Lateral Project is an additional project under development to transport crude oil to refineries in the Houston area. All permits necessary for the project are in place and construction is underway.

With more than 60 years' experience, TransCanada is a leader in the responsible development and reliable operation of North American energy infrastructure including natural gas and oil pipelines, power generation and gas storage facilities. TransCanada operates a network of natural gas pipelines that extends more than 68,500 kilometres (42,500 miles), tapping into virtually all major gas supply basins in North America. TransCanada is one of the continent's largest providers of gas storage and related services with more than 400 billion cubic feet of storage capacity. A growing independent power producer, TransCanada owns or has interests in over 11,800 megawatts of power generation in Canada and the United States. TransCanada is developing one of North America's largest oil delivery systems. TransCanada's common shares trade on the Toronto and New York stock exchanges under the symbol TRP. For more information visit: www.transcanada.com or check us out on Twitter @TransCanada or http://blog.transcanada.com.

FORWARD LOOKING INFORMATION

This publication contains certain information that is forward-looking and is subject to important risks and uncertainties (such statements are usually accompanied by words such as "anticipate", "expect", "believe", "may", "will", "should", "estimate", "intend" or other similar words). Forward-looking statements in this document are intended to provide TransCanada security holders and potential investors with information regarding TransCanada and its subsidiaries, including management's assessment of TransCanada's and its subsidiaries' future plans and financial outlook. All forward-looking statements reflect TransCanada's beliefs and assumptions based on information available at the time the statements were made and as such are not guarantees of future performance. Readers are cautioned not to place undue reliance on this forward-looking information, which is given as of the date it is expressed in this news release, and not to use future-oriented information or financial outlooks for anything other than their intended purpose. TransCanada undertakes no obligation to update or revise any forward-looking information except as required by law. For additional information on the assumptions made, and the risks and uncertainties which could cause actual results to differ from the anticipated results, refer to TransCanada's Quarterly Report to Shareholders dated November 4, 2013 and 2012 Annual Report on our website at www.transcanada.com or filed under TransCanada's profile on SEDAR at www.sedar.com and with the U.S. Securities and Exchange Commission at www.sec.gov.

Contacts:
TransCanada
Media Enquiries:
Shawn Howard/Grady Semmens/Davis Sheremata
403.920.7859 or 800.608.7859

TransCanada
Investor & Analyst Enquiries:
David Moneta/Lee Evans
403.920.7911 or 800.361.6522
www.transcanada.com
