By PR Newswire
January 29, 2014 01:56 PM EST
SAN FRANCISCO, Jan. 29, 2014 /PRNewswire/ -- PG&E announced today that, in 2013, the 15 million people served by the utility experienced the fewest minutes without electricity in company history. The just-released numbers for 2013 show that PG&E continues to make significant progress in safely delivering reliable service that benefits homes and businesses throughout Northern and Central California.
In 2013, not only did the average duration of a service interruption for a PG&E customer fall to an all-time low, but customers also experienced the fewest service interruptions in company history. Customers have seen a 40 percent improvement in the average duration of a service disruption and a 27 percent improvement in the number of customer interruptions since 2006.
"Thanks to the significant work that has been done to upgrade and modernize electric facilities throughout our service area, PG&E's customers are experiencing the most reliable service in our company's history," said Geisha Williams, executive vice president of Electric Operations for PG&E. "We are committed to building on this success and to further strengthening our operations to provide our customers with the safe, reliable and affordable electric service they expect and deserve."
PG&E and other electric utilities measure the overall reliability of their systems using two primary indices defined by the Institute of Electrical and Electronics Engineers (IEEE). The System Average Interruption Duration Index (SAIDI) measures the number of minutes over the year that the average customer is without power. The average PG&E customer was without power for 117 minutes in all of 2013, a reduction from 196 minutes in 2006. The System Average Interruption Frequency Index (SAIFI) measures the system-wide frequency of power interruptions per customer. The average customer experienced 1.07 power interruptions in 2013, compared to 1.46 in 2006.
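The arithmetic behind these two indices can be sketched in a few lines. The following is a minimal illustration only; the outage records and customer count are invented for the example and do not reflect PG&E data.

```python
# Hypothetical sketch of the IEEE SAIDI/SAIFI calculations.
# All figures below are made up for illustration.

total_customers = 100_000  # assumed number of customers served

# Each outage record: (customers affected, minutes of interruption)
outages = [
    (5_000, 120),
    (12_000, 45),
    (800, 300),
]

# SAIDI: total customer-minutes of interruption divided by
# total customers served (minutes per customer per year)
saidi = sum(n * minutes for n, minutes in outages) / total_customers

# SAIFI: total customer interruptions divided by
# total customers served (interruptions per customer per year)
saifi = sum(n for n, _ in outages) / total_customers

print(f"SAIDI: {saidi:.2f} minutes per customer")
print(f"SAIFI: {saifi:.2f} interruptions per customer")
```

On these invented records, SAIDI works out to 13.80 minutes and SAIFI to 0.18 interruptions per customer, mirroring how the 117-minute and 1.07-interruption figures above are derived at system scale.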
To ensure that customers receive the safest, most reliable and affordable service possible, the company is focused on continuous improvement. These gains in service were due in part to the utility's investments in several key projects, including:
- Targeted Circuit Program. In 2013, PG&E crews targeted 75 circuits based on their history of outages. Crews strengthened the circuits and used infrared technology to identify potential trouble spots so that stressed equipment could be repaired or replaced before it failed. PG&E has upgraded more than 330 circuits over the past five years.
- Intelligent Switches. This Smart Grid technology reduces the amount of time it takes to restore power to customers. Instead of waiting for a crew to arrive on scene to restore circuits manually, the new devices do it automatically, often within minutes. Utility workers installed automated "intelligent" switches on 392 circuits last year. In total, more than 500 circuits have been enabled with this advanced technology that benefits customers.
- Rural Circuit Upgrades. Since 2010, PG&E has installed more than 7,000 sets of fuses and 700 line reclosers on more than 500 of the worst-performing rural circuits to isolate service interruptions and minimize their impact on customers.
- Substation Upgrades. Technicians have replaced and upgraded substation equipment to handle an increase in demand, to improve equipment performance or to maintain or restore service when electricity needs to be rerouted.
- Vegetation Management Reliability Program. Crews worked to keep our electric lines free from trees and brush, helping to ensure the safe and reliable delivery of service to our customers. In the past five years, this program has reduced vegetation-related outages by 51 percent.
Pacific Gas and Electric Company, a subsidiary of PG&E Corporation (NYSE:PCG), is one of the largest combined natural gas and electric utilities in the United States. Based in San Francisco with 21,000 employees, the company delivers some of the nation's cleanest energy to 15 million people in Northern and Central California. For more information about PG&E, visit www.pge.com/about/newsroom/ or www.pgecurrents.com.
SOURCE Pacific Gas and Electric Company (PG&E)