PG&E Verifies Strength Of 194 Miles Of Pipeline In 2012; Begins Construction Of State-Of-The-Art Gas Control Center

More Gas System Safety Work Planned for 2013

SAN FRANCISCO, Dec. 18, 2012 /PRNewswire/ -- Building on its commitment to natural gas pipeline safety, Pacific Gas and Electric Company (PG&E) announced that it validated the safe operating pressure of an additional 194 miles of natural gas transmission pipeline in 2012, through hydrostatic pressure testing and rigorous records validation.

These activities were completed in areas throughout Northern and Central California as part of PG&E's Pipeline Safety Enhancement Program (PSEP) and through regularly scheduled projects to verify the integrity of natural gas transmission lines—the large-diameter pipes that carry gas across long distances. Since PSEP projects launched in 2011, PG&E has successfully tested a total of 409 miles of gas transmission pipeline.

PG&E also started construction on a state-of-the-art gas control center at its new Gas Operations headquarters at Bishop Ranch in San Ramon, Calif. The center will serve as a central location from which PG&E will monitor the safe and reliable operation of its 6,700 miles of transmission pipeline and 42,000 miles of smaller-diameter distribution mains. Completion of the gas control center is expected in Q2 2013.

"There is nothing more important than working to make our natural gas system safer in the communities we serve. We've made tremendous progress, but the work we've done is just the beginning," said Nick Stavropoulos, executive vice president of Gas Operations for PG&E. "Our goal is to operate the safest and most reliable natural gas system in the country."

Hydrostatic pressure testing is one of several important measures PG&E is taking to enhance the safety and strength of its natural gas system. Through the end of 2014, which marks phase one of PSEP, the utility plans to pressure test or validate 783 miles of gas transmission pipeline, replace 185 miles of pipeline, automate more than 220 valves, and upgrade nearly 200 miles of pipeline to accommodate advanced in-line inspection tools known as "smart pigs."

"PSEP is a multi-year program that will enhance the safety and reliability of our natural gas transmission pipelines in communities throughout our service area. It will help us to assess our pipeline system and improve the delivery of safe, reliable and affordable natural gas to our customers," said Jesus Soto Jr., senior vice president of Gas Transmission Operations for PG&E.  

This year, in addition to validating the safe operating pressure of 194 miles of pipeline, PG&E has made significant progress in improving transmission pipeline safety and reliability, including:

  • Installing more than 34 miles of new transmission pipeline in urban areas.
  • Installing 37 automated valves in urban or active seismic fault crossing areas to allow for remote or automated shut-off in the event of a rupture. Fifty valves have been completed since the program commenced in 2011.   
  • Retrofitting 78 miles of pipeline to accommodate state-of-the-art in-line inspection tools known as "smart pigs."  

In 2013, the utility plans to expand its pipeline safety work further, strength testing more than 200 miles of pipeline, replacing more than 60 miles of pipeline and automating 75 valves, all in addition to work completed to date.
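Taken together, the phase-one targets and the progress figures cited in this release lend themselves to a simple back-of-the-envelope tally. The short Python sketch below is one illustrative way to compute it using only the totals stated above; the pairing of year-to-date figures with phase-one targets is an assumption made for illustration, not an official PG&E accounting.

```python
# Illustrative PSEP phase-one progress tally, using only figures cited in this release.
# Note: some "completed" figures cover 2011-2012, others 2012 alone, so the
# percentages are approximate and for illustration only.

targets = {
    "miles pressure tested or validated": 783,   # phase-one target through end of 2014
    "miles of pipeline replaced": 185,
    "valves automated": 220,
    "miles upgraded for in-line inspection": 200,
}

completed = {
    "miles pressure tested or validated": 409,   # since PSEP projects launched in 2011
    "miles of pipeline replaced": 34,            # new transmission pipe installed in 2012
    "valves automated": 50,                      # since the program commenced in 2011
    "miles upgraded for in-line inspection": 78, # retrofitted for smart pigs in 2012
}

for item, target in targets.items():
    done = completed[item]
    print(f"{item}: {done} of {target} ({done / target:.0%})")
```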

To learn more about how PG&E conducts hydrostatic pressure testing and what to expect if testing is planned for your area, visit www.pge.com/gas.  

Pacific Gas and Electric Company, a subsidiary of PG&E Corporation (NYSE:PCG), is one of the largest combined natural gas and electric utilities in the United States. Based in San Francisco, with 20,000 employees, the company delivers some of the nation's cleanest energy to 15 million people in Northern and Central California. For more information, visit:  http://www.pge.com/about/newsroom/ and www.pgecurrents.com.

SOURCE Pacific Gas and Electric Company (PG&E)
