News Feed Item

Geospatial Corporation's Smart Probe Device Accurately Maps Two New Natural Gas Pipelines in Texas

Identifies Exact Position of Very Deep Pipelines Underneath Waterways

PITTSBURGH, PA--(Marketwired - September 04, 2014) - Geospatial Corporation (PINKSHEETS: GSPH) has accurately mapped two newly installed pipelines running under multiple waterways -- Copperas Branch and Lake Lewisville -- just outside Dallas, Texas.

Geospatial Corporation provided accurate three-dimensional pipeline mapping using its proprietary Smart Probe system, which allows the company to map conduits and pipelines of any material (metallic or non-metallic), size, or depth. The Smart Probe is ideal for mapping very deep pipelines under water because it requires no data tether or any communication with the surface.
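Tetherless probes of this kind typically rely on inertial dead reckoning: the device logs orientation and distance samples as it travels through the pipe, and the 3D path is integrated afterward. The sketch below illustrates that general idea with hypothetical sample data; it is not Geospatial's actual algorithm, and the heading/pitch/distance format is an assumption for illustration.

```python
import math

def reconstruct_path(samples, start=(0.0, 0.0, 0.0)):
    """Integrate (heading_deg, pitch_deg, distance_m) samples into a 3D path.

    Dead-reckoning sketch: each sample advances the probe `distance_m`
    along the direction given by compass heading and pitch from horizontal.
    """
    x, y, z = start
    path = [start]
    for heading_deg, pitch_deg, dist in samples:
        h = math.radians(heading_deg)  # compass heading, 0 = north
        p = math.radians(pitch_deg)    # pitch from horizontal, negative = descending
        x += dist * math.cos(p) * math.sin(h)  # east component
        y += dist * math.cos(p) * math.cos(h)  # north component
        z += dist * math.sin(p)                # vertical component
        path.append((x, y, z))
    return path

# Hypothetical run: 100 m due north while descending at 10 degrees,
# then 50 m level -- the final z gives the depth reached.
pts = reconstruct_path([(0, -10, 100), (0, 0, 50)])
```

In practice such systems also correct for sensor drift and tie the reconstructed path to surface reference points, which simple integration like this cannot do on its own.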

Mark Smith, Geospatial's CEO, stated, "Geospatial now has the technologies to map most underground and underwater pipelines. Our growing data acquisition technologies, our continued enhancements to GeoUnderground, and the industry's rapid acceptance of cloud-based information management platforms position Geospatial to emerge as a key player in the underground infrastructure management industry."

The first section of pipeline measured approximately 2,100 feet with a maximum depth of approximately 80 feet. The second section measured approximately 5,600 feet with a depth of more than 120 feet. After data collection was complete, Geospatial verified the consistency of the mapping results and delivered the findings through GeoUnderground, where clients can readily view and work with the data.

GeoUnderground, designed around Google Maps Engine and the Google Maps API, is Geospatial's cloud-based GIS platform that provides clients with a total solution to their underground asset management needs.
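Platforms built on the Google Maps API commonly exchange mapped geometry as GeoJSON, where a surveyed pipeline run becomes a LineString whose coordinates may carry a third value for elevation or depth. The snippet below is an illustrative sketch only; the coordinates and property names are hypothetical and do not reflect GeoUnderground's actual data model.

```python
import json

# Hypothetical surveyed points near Lake Lewisville:
# (longitude, latitude, depth in feet relative to grade).
surveyed = [
    (-96.9670, 33.1260, -80.0),
    (-96.9655, 33.1271, -95.0),
    (-96.9640, 33.1282, -120.0),
]

feature = {
    "type": "Feature",
    "geometry": {
        "type": "LineString",
        # GeoJSON permits an optional third coordinate per position.
        "coordinates": [list(p) for p in surveyed],
    },
    "properties": {"name": "Lake Lewisville crossing (illustrative)"},
}

geojson = json.dumps(feature)
```

A feature like this can be loaded directly by the Google Maps JavaScript API's data layer or by most web GIS viewers, which is one reason GeoJSON is a common interchange format for pipeline geometry.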

About Geospatial Corporation

Geospatial Corporation utilizes integrated technologies to determine the accurate location and position of underground pipelines, conduits and other underground infrastructure, allowing it to create accurate three-dimensional (3D) digital maps and models of that infrastructure. The Company manages this critical infrastructure data on GeoUnderground (www.GeoUnderground.com), its proprietary cloud-based GIS platform custom designed around the Google Maps API and Google Maps Engine. Our website is www.GeospatialCorporation.com.

GeoUnderground is the Company's powerful cloud-based geographic information system (GIS) database, which enables users to view and utilize this 3D pipeline mapping information securely from any desktop or mobile device. GeoUnderground integrates seamlessly with Geospatial's technologies for gathering geo-referenced digital information on underground, underwater and aboveground infrastructure of all types.

For the first time, licensed users have available to them a suite of technologies that allows them to collect data, create highly accurate 3D maps and models of both above-ground and below-ground infrastructure, and securely view and share this invaluable information with peers and associates anywhere in the world through a conventional browser via the cloud.

Geospatial provides integrated data acquisition technologies that accurately locate and map underground and above-ground infrastructure assets, such as pipelines and surface features, via its cloud-based GeoUnderground portal.

Forward-Looking Statements

This press release contains forward-looking information within the meaning of the Private Securities Litigation Reform Act of 1995, Section 27A of the Securities Act of 1933 and Section 21E of the Securities Exchange Act of 1934, and is subject to the safe harbor created by those laws. These forward-looking statements, if any, are based upon a number of assumptions and estimates that are subject to significant uncertainties involving known and unknown risks, many of which are beyond our control, and are not guarantees of future performance. Actual outcomes and results could differ materially from what is expressed, implied, or forecasted in any such forward-looking statements, and any such difference may be caused by risk factors listed from time to time in the Company's news releases and/or its filings, or as a result of other factors.


