
PG&E Affirms Commitment to Safety and Expresses Confidence Legal Process Ultimately Will Uphold Company Position

SAN FRANCISCO, July 30, 2014 /PRNewswire/ -- Pacific Gas and Electric Company today expressed confidence that the legal process will uphold its position that the federal charges contained in a superseding indictment formally filed today are unwarranted.

The charges relate to 27 alleged violations of the federal Pipeline Safety Act as well as an allegation that the company attempted to obstruct a National Transportation Safety Board (NTSB) investigation following the 2010 explosion of a natural gas transmission line in San Bruno.

The superseding indictment, which replaces an indictment issued in April, was announced by the United States Attorney's Office (USAO) in San Francisco late yesterday and filed today.

In response, the company issued this statement:

"Based on all of the evidence we have seen to date and our review of the new indictment, we still do not believe that PG&E employees intentionally violated the federal Pipeline Safety Act, and we believe that, even where mistakes were made, employees were acting in good faith to provide customers with safe and reliable energy."

"With respect to the allegation of obstruction, during the NTSB investigation PG&E responded to hundreds of questions and requests for information and documents from the NTSB on an expedited basis. In the one response questioned in the USAO charge, PG&E had submitted a cover sheet approval form mistakenly attached to the wrong internal engineering document. PG&E corrected this error with a letter dated April 6, 2011. The NTSB published the letter on its accident investigation docket on September 30, 2011, and it has been publicly available since then. PG&E believes the letter is true and accurate and stands by it."

"We are confident the legal process will ensure all of the facts are fully reviewed. In the meantime, we want all of our customers to know that we will stay focused on transforming this 100-plus-year-old natural gas system into the safest and most reliable in the country."

"San Bruno was a tragic accident. We've taken accountability and are deeply sorry. We have worked hard to do the right thing for victims, their families and the community, and we will continue to do so. We are absolutely committed to re-earning the trust of all of the people we are fortunate to serve every day."

The indictment also seeks to increase the financial penalties associated with the charges. However, PG&E noted that the financial penalties would be moot should the government fail to prove its case.

Since the 2010 explosion, PG&E has worked hard to improve the natural gas system and make safety the foundation of its culture. Among the steps the company has taken: 

  • Change began at the top with Tony Earley joining the company as the CEO in 2011. We restructured our gas operations business and recruited the best natural gas experts in the country to run it.
  • In order to help ensure the safety of the existing pipeline system, we digitized records, conducted advanced pressure testing, replaced pipe where necessary and deployed 150 new automated or remotely controlled valves.
  • We built a new gas operations control center from which we can monitor the entire system and respond more quickly and effectively to emergencies. It employs the most advanced 21st century technology.
  • When a customer calls to report a gas odor, we are now among the fastest in the entire industry in responding, and we've adopted new gas leak detection technology that is 1,000 times more sensitive than before in order to help find and fix leaks before they become a problem.
  • We put 3,500 leaders at all levels of PG&E through safety training and we review the lessons of San Bruno with every new employee we hire as we work each and every day to put safety first.
  • We recently became one of the first utilities in the world to earn two of the highest safety certifications – the International Organization for Standardization (ISO) 55001 and Publicly Available Specification (PAS) 55. These stringent certifications must be re-earned every year.

The company has settled claims amounting to more than $500 million with the victims and families of the San Bruno accident, established a $50 million trust for the City of San Bruno for costs related to recovery and contributed $70 million to support the city's and community's recovery efforts.

For additional information and ongoing updates on this issue, please visit www.PGEresponds.com.

About PG&E

Pacific Gas and Electric Company, a subsidiary of PG&E Corporation (NYSE:PCG), is one of the largest combined natural gas and electric utilities in the United States. Based in San Francisco, with more than 20,000 employees, the company delivers some of the nation's cleanest energy to nearly 16 million people in Northern and Central California. For more information, visit www.pge.com/ and http://www.pge.com/about/newsroom/.

SOURCE Pacific Gas and Electric Company (PG&E)

