PG&E Seeks Reasonable, Proportionate CPUC Penalty That Takes Into Account Investments And Advances In Safety Since San Bruno Explosion

SAN FRANCISCO, Sept. 2, 2014 /PRNewswire/ -- Pacific Gas and Electric Company (PG&E) today said that a penalty being considered by the California Public Utilities Commission (CPUC) in connection with the 2010 explosion of a natural gas transmission pipeline in San Bruno should be reasonable and take into account precedent and the investments the company has made to promote safety.

Today, the CPUC's administrative law judges released a recommended penalty resulting from the investigations. While the CPUC characterized the penalty as totaling approximately $2 billion, PG&E believes that the total shareholder impact could reach approximately $4.75 billion, including the $2.7 billion in estimated costs that shareholders have already incurred, or are forecast to incur, to improve and enhance the safety of PG&E's natural gas operations. The ultimate amount will depend on the scope and timing of the work and other factors, many of which are described in PG&E Corporation's and PG&E's recent Securities and Exchange Commission reports. The CPUC is expected to take at least 45 days to reach a final decision.

PG&E Corporation Chairman, CEO and President Tony Earley said:

"Since the 2010 explosion of our natural gas transmission pipeline in San Bruno, we've been dedicated to re-earning the trust of our customers and the communities we serve. We are deeply sorry for this tragic event.

"We are accountable and fully accept that a penalty of some kind is appropriate. However, we have respectfully asked that the Commission ensure that the penalty is reasonable and proportionate and takes into consideration the company's investments and actions to promote safety. Moreover, we believe any penalty should directly benefit public safety.

"We've worked hard to do the right thing for the victims, their families and the community of San Bruno. Beyond this, all of us at PG&E have committed ourselves to a goal to transform this company into the safest and most reliable energy provider in America. We've hired some of the best gas experts in the country to help guide this effort and supported it with billions of dollars in shareholder funding.

"We have made tremendous progress but we're not done. We have more work to do and we won't rest until it's done and done right."

Here are just a few of the concrete actions the company has taken to make safety the cornerstone of its culture:

  • Change began at the top with Tony Earley joining the company as CEO in 2011. We restructured our gas operations business and hired the best natural gas experts in the country to run it.
  • We put 3,500 leaders at all levels of PG&E through safety training, and we review the lessons of San Bruno with every new employee.
  • We have conducted advanced pipeline safety testing, replaced pipe where necessary and installed 150 new automated or remotely controlled emergency shut-off valves.
  • We built a new gas operations control center, equipped with the most advanced technology, from which we can monitor the entire system and respond more quickly and effectively to emergencies.
  • We're using new gas leak detection technology that is 1,000 times more sensitive than our previous equipment, helping us find and fix leaks before they become a problem. When a customer calls to report a gas odor, our response times are now among the fastest in the industry.

As a result of these and many other efforts, PG&E recently became one of the first utilities ever to earn two of the highest internationally recognized safety certifications—the International Organization for Standardization (ISO) 55001 and Publicly Available Specification (PAS) 55-1. These stringent certifications must be re-earned every year.

The company has settled claims amounting to more than $500 million with the victims and families of the San Bruno accident, established a $50 million trust for the City of San Bruno for costs related to recovery, and contributed $70 million to support the city's and community's recovery efforts.

About PG&E

Pacific Gas and Electric Company, a subsidiary of PG&E Corporation (NYSE:PCG), is one of the largest combined natural gas and electric utilities in the United States. Based in San Francisco, with 20,000 employees, the company delivers some of the nation's cleanest energy to nearly 16 million people in Northern and Central California. For more information, visit http://www.pge.com/about/newsroom/ and www.pgecurrents.com.

SOURCE Pacific Gas and Electric Company
