Geospatial Corporation, a Google Cloud Partner - Technology Track, Announces Success Introducing Underground Pipeline Mapping, Management and Software to the Energy Industry

PITTSBURGH, Dec. 1, 2016 /PRNewswire/ -- Geospatial Corporation (OTCQB symbol: "GSPH") announced today that the Company has successfully completed more than 50 pipeline mapping projects under its recently introduced Quality Assurance ("QA") Locational Integrity Management ("ILIM") program for major oil & gas companies across the Southeastern US. Todd Porter, President of Geospatial's Energy group based in Houston, Texas, stated, "Every energy company managing pipelines realizes the critical need to maintain accurate and complete information on these pipelines, including location and depth. While each of these initial projects is, as a first step, relatively small, we are now discussing the benefits of entering into long-term, extensive facility mapping opportunities encompassing the mapping of large sections of underground pipelines and providing the data to the client on GeoUnderground."

The program allows for the accurate mapping of pipelines and conduits installed via Horizontal Directional Drilling ("HDD") or conventional trenching methods, regardless of pipeline depth, type of material or soil conditions. This service addresses the need for accurate 3D mapping of critical pipeline segments, exceeding regulatory requirements and supporting integrity and reliability demands. The new requirements of PHMSA regulations under the PIPES Act of 2016 affect the entire 2.4 million miles of buried gas pipelines in the USA.
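To make the 3D aspect concrete, here is a minimal sketch of how an HDD bore survey could be represented so the pipeline path is recoverable in three dimensions. The types, field names, and units are hypothetical illustrations, not Geospatial's actual data format:

```typescript
// Hypothetical 3D representation of an HDD bore survey (illustrative
// only; not Geospatial's actual schema). Each station records a
// horizontal position plus depth, so the bore path can be
// reconstructed regardless of how deep the pipeline runs.

interface BoreStation {
  easting: number;  // meters east, in a projected coordinate system
  northing: number; // meters north, in a projected coordinate system
  depth: number;    // meters below grade
}

// True 3D length of the bore path: straight-line distance between
// consecutive stations, including the vertical component that a
// flat 2D map would miss.
function borePathLength(stations: BoreStation[]): number {
  let total = 0;
  for (let i = 1; i < stations.length; i++) {
    const a = stations[i - 1];
    const b = stations[i];
    total += Math.hypot(
      b.easting - a.easting,
      b.northing - a.northing,
      b.depth - a.depth,
    );
  }
  return total;
}
```

Comparing borePathLength with the same sum computed without the depth term is one simple way to see how much a 2D-only map understates the true length of a deep HDD crossing.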

In addition to serving the energy industry, Geospatial Corporation and its cloud-based GeoUnderground software provide a simple, secure and economical solution for the 60,000 municipalities across the US to organize and manage both aboveground and belowground infrastructure. Geospatial helps municipalities and utility companies more accurately locate and map their underground pipelines, conduits and other underground assets through a portfolio of high-tech data collection devices and highly trained technicians.

About Geospatial Corporation

Geospatial Corporation utilizes integrated technologies to determine the accurate location and position of underground pipelines, conduits and other underground infrastructure, allowing Geospatial to create accurate three-dimensional (3D) digital maps and models of underground infrastructure. The Company manages this critical infrastructure data on GeoUnderground, its proprietary cloud-based GIS platform custom designed around the Google Maps API. Our website is www.GeospatialCorporation.com.

Geospatial provides integrated data acquisition technologies that accurately locate and map underground and aboveground infrastructure assets such as pipelines and surface features via its GeoUnderground cloud-based portal. 

About GeoUnderground

GeoUnderground, designed around the Google Maps API, is Geospatial's cloud-based GIS platform that provides clients with a total solution to their underground and aboveground asset management needs. A free trial can be downloaded at www.GeoUnderground.com. Geospatial is a Google Cloud Partner – Technology Track.

GeoUnderground leverages Google Maps to deliver asset management capabilities in the cloud under an economical SaaS model, making it affordable for every municipality in the country to digitally manage its infrastructure. GeoUnderground is secure and simple to use, with little need for massive technological changes, training or development. You just use Google Maps in a new and effective way. Now "All Cities," big or small, can afford to be "Smart Cities."
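"Using Google Maps in a new and effective way" can be pictured with a short sketch against the public Google Maps JavaScript API: render a buried pipeline as a polyline and surface its surveyed depth on click. The data model and function names below are hypothetical illustrations, not GeoUnderground's actual code.

```typescript
// Minimal sketch against the Google Maps JavaScript API (assumes the
// Maps script has loaded `google.maps` and @types/google.maps is
// installed for type checking). The PipelinePoint shape is a
// hypothetical example, not GeoUnderground's actual data model.

interface PipelinePoint {
  lat: number;
  lng: number;
  depthFt: number; // surveyed depth below grade, in feet
}

function renderPipeline(map: google.maps.Map, points: PipelinePoint[]): void {
  if (points.length === 0) return;

  // Draw the pipeline centerline as a polyline over the base map.
  const line = new google.maps.Polyline({
    path: points.map(p => ({ lat: p.lat, lng: p.lng })),
    strokeColor: "#d32f2f",
    strokeWeight: 3,
    map,
  });

  // On click, show the depth recorded at the nearest surveyed point.
  const info = new google.maps.InfoWindow();
  line.addListener("click", (e: google.maps.PolyMouseEvent) => {
    if (!e.latLng) return;
    let nearest = points[0];
    let best = Number.POSITIVE_INFINITY;
    for (const p of points) {
      const d = (p.lat - e.latLng.lat()) ** 2 + (p.lng - e.latLng.lng()) ** 2;
      if (d < best) {
        best = d;
        nearest = p;
      }
    }
    info.setContent(`Depth: ${nearest.depthFt.toFixed(1)} ft below grade`);
    info.setPosition(e.latLng);
    info.open(map);
  });
}
```

Because everything rides on the familiar Google Maps interface, a field crew or municipal clerk needs no specialized GIS training to browse the mapped assets.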

Forward Looking Statements

This press release contains forward-looking information within the meaning of the Private Securities Litigation Reform Act of 1995, Section 27A of the Securities Act of 1933 and Section 21E of the Securities Exchange Act of 1934 and is subject to the safe harbor created by those laws. These forward-looking statements, if any, are based upon a number of assumptions and estimates that are subject to significant uncertainties that involve known and unknown risks, many of which are beyond our control and are not guarantees of future performance. Actual outcomes and results could differ materially from what is expressed, implied, or forecasted in any such forward-looking statements, and any such difference may be caused by risk factors listed from time to time in the Company's news releases and/or its filings or as a result of other factors.

 

To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/geospatial-corporation-a-google-cloud-partner--technology-track-announces-success-introducing-underground-pipeline-mapping-management-and-software-to-the-energy-industry-300371088.html

SOURCE Geospatial Corporation

