By PR Newswire
December 18, 2012 01:45 PM EST
SAN FRANCISCO, Dec. 18, 2012 /PRNewswire/ -- Building on its commitment to natural gas pipeline safety, Pacific Gas and Electric Company (PG&E) announced that it validated the safe operating pressure of an additional 194 miles of natural gas transmission pipeline in 2012, through hydrostatic pressure testing and rigorous records validation.
These activities were completed in areas throughout Northern and Central California as part of PG&E's Pipeline Safety Enhancement Program (PSEP) and through regularly scheduled projects to verify the integrity of natural gas transmission lines—the large-diameter pipes that carry gas across long distances. Since PSEP projects launched in 2011, PG&E has successfully tested a total of 409 miles of gas transmission pipeline.
PG&E also started construction on a state-of-the-art gas control center at its new Gas Operations headquarters at Bishop Ranch in San Ramon, Calif. The center will serve as a central location from which PG&E will monitor the safe and reliable operation of its 6,700 miles of transmission pipeline and 42,000 miles of smaller-diameter distribution mains. Completion of the gas control center is expected in Q2 2013.
"There is nothing more important than working to make our natural gas system safer in the communities we serve. We've made tremendous progress, but the work we've done is just the beginning," said Nick Stavropoulos, executive vice president of Gas Operations for PG&E. "Our goal is to operate the safest and most reliable natural gas system in the country."
Hydrostatic pressure testing is one of several important measures PG&E is taking to enhance the safety and strength of its natural gas system. Through the end of 2014, phase one of the PSEP program, the utility plans to pressure test or validate 783 miles of gas transmission pipeline, replace 185 miles of pipeline, automate more than 220 valves, and upgrade nearly 200 miles of pipeline to accommodate advanced in-line inspection tools known as "smart pigs."
"PSEP is a multi-year program that will enhance the safety and reliability of our natural gas transmission pipelines in communities throughout our service area. It will help us to assess our pipeline system and improve the delivery of safe, reliable and affordable natural gas to our customers," said Jesus Soto Jr., senior vice president of Gas Transmission Operations for PG&E.
This year, in addition to validating the safe operating pressure of 194 miles of pipeline, PG&E has made significant progress in improving transmission pipeline safety and reliability, including:
- Installing more than 34 miles of new transmission pipeline in urban areas.
- Installing 37 automated valves in urban or active seismic fault crossing areas to allow for remote or automated shut-off in the event of a rupture. Fifty valves have been completed since the program commenced in 2011.
- Retrofitting 78 miles of pipeline to accommodate state-of-the-art in-line inspection tools known as "smart pigs."
In 2013, the utility plans to further expand its pipeline safety work, strength testing more than 200 miles of pipeline, replacing more than 60 miles of pipeline and automating 75 valves – all in addition to work completed to date.
To learn more about how PG&E conducts hydrostatic pressure testing and what to expect if testing is planned for your area, visit www.pge.com/gas.
Pacific Gas and Electric Company, a subsidiary of PG&E Corporation (NYSE:PCG), is one of the largest combined natural gas and electric utilities in the United States. Based in San Francisco, with 20,000 employees, the company delivers some of the nation's cleanest energy to 15 million people in Northern and Central California. For more information, visit: http://www.pge.com/about/newsroom/ and www.pgecurrents.com.
SOURCE Pacific Gas and Electric Company (PG&E)