Trimble Introduces State-of-the-Art Integrated Seismic and Geodetic System for Earth Sciences and Infrastructure Monitoring Applications

ISTANBUL, Aug. 25, 2014 /PRNewswire/ -- Trimble (NASDAQ:TRMB) today introduced an integrated Global Navigation Satellite System (GNSS) reference receiver, broadband seismic recorder and force-balance triaxial accelerometer for infrastructure and precise scientific applications: the Trimble® SG160-09 SeismoGeodetic system. The SG160-09 provides real-time GNSS positioning and seismic data for earthquake early warning and volcano monitoring, as well as infrastructure monitoring of buildings, bridges, dams and other natural and manmade structures.

The announcement was made at the Second European Conference on Earthquake Engineering and Seismology (2ECEES) in Istanbul, Turkey.

The Trimble SG160-09 SeismoGeodetic system combines the innovation, reliability and data integrity of the Trimble and REF TEK brands in a single instrument, integrating seismic recording with GNSS geodetic measurement in a compact, ruggedized package. It includes a low-power, 220-channel GNSS receiver powered by the latest Trimble Maxwell™ 6 technology and supports tracking of GPS and GLONASS signals as well as the Galileo E1 frequency.

The system pairs the SG160-09 with Trimble's CenterPoint™ RTX™ correction service, which provides on-board GNSS point positioning. Based on Trimble RTX technology, the service uses satellite clock and orbit information delivered over cellular networks or Internet Protocol (IP), enabling centimeter-level position displacement tracking in real time anywhere in the world. The SG160-09 system will also be available without the RTX correction service for applications that use real-time kinematic (RTK) positioning.
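
As a purely illustrative aside, and not part of the product or Trimble's software, position displacement tracking from corrected fixes can be pictured as converting each geodetic position into local east-north-up offsets relative to a fixed reference point. The minimal Python sketch below assumes hypothetical corrected latitude/longitude/height values and standard WGS-84 constants.

import math

# WGS-84 ellipsoid constants
A = 6378137.0               # semi-major axis (m)
F = 1.0 / 298.257223563     # flattening
E2 = F * (2.0 - F)          # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert geodetic coordinates (degrees, meters) to ECEF (meters)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)   # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

def enu_displacement(ref, pos):
    """East-north-up displacement (m) of 'pos' relative to 'ref', both (lat, lon, h)."""
    lat, lon = math.radians(ref[0]), math.radians(ref[1])
    x0, y0, z0 = geodetic_to_ecef(*ref)
    x, y, z = geodetic_to_ecef(*pos)
    dx, dy, dz = x - x0, y - y0, z - z0
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    return east, north, up

# Hypothetical example: a few centimeters of motion relative to the reference position
reference = (40.0, 29.0, 120.0)
corrected = (40.00000010, 29.00000012, 120.03)
print([round(v, 3) for v in enu_displacement(reference, corrected)])

In a monitoring context, each corrected fix would be run through a function like this against the station's nominal position, and the resulting displacement time series would be what operators actually watch.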

The seismic sensor is an ANSS Class A, low-noise, force-balance triaxial accelerometer paired with the latest low-power, 24-bit A/D converter, which produces high-resolution seismic data. The built-in accelerometer has a ±4g full-scale output, a large linear range, and high resolution and sensitivity, making it ideal for both portable and permanent deployments. The SG160-09 processor acquires and packetizes both seismic and geodetic data and transmits it to system operators using an advanced error-correction protocol with back-fill capability, providing data integrity between the field and the processing center.
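
Trimble does not publish the wire format of this protocol; purely as a hedged sketch of the general idea, the hypothetical Python example below frames sample blocks into packets carrying a sequence number and checksum, so a receiver can detect corrupted frames and sequence gaps that would trigger a back-fill request. All names and field layouts here are assumptions for illustration only.

import struct
import zlib

def make_packet(seq, channel, samples):
    """Frame one block of samples: header (seq, channel, count) + payload + CRC32."""
    payload = struct.pack(f">{len(samples)}i", *samples)     # 32-bit big-endian samples
    header = struct.pack(">IHH", seq, channel, len(samples))
    crc = zlib.crc32(header + payload)
    return header + payload + struct.pack(">I", crc)

def parse_packet(packet):
    """Return (seq, channel, samples) or None if the checksum fails."""
    body, (crc,) = packet[:-4], struct.unpack(">I", packet[-4:])
    if zlib.crc32(body) != crc:
        return None                                           # corrupted in transit
    seq, channel, count = struct.unpack(">IHH", body[:8])
    samples = struct.unpack(f">{count}i", body[8:])
    return seq, channel, list(samples)

# A receiver tracks the expected sequence number; a jump indicates missing
# packets that should be requested again (the "back-fill").
expected = 0
for pkt in (make_packet(0, 1, [10, 11]), make_packet(2, 1, [14, 15])):
    seq, channel, samples = parse_packet(pkt)
    if seq != expected:
        print(f"gap detected: request back-fill of packets {expected}..{seq - 1}")
    expected = seq + 1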

The SG160-09 system is ideal for earthquake early warning studies and other hazard mitigation applications, such as volcano monitoring and building, bridge and dam monitoring systems. The system features a variable-size, industrial-grade USB drive that backs up real-time telemetry data. In the event of a telemetry link outage, data is stored on the USB drive and re-transmitted to the centralized processing station as soon as the communication link is restored, so no data is lost during system operation.
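
Again only as a hypothetical sketch, and not Trimble's implementation, the store-and-forward behavior described above amounts to: attempt to send each packet, spool it to local storage when the link is down, and drain the spool once the link returns.

import os
from collections import deque

class StoreAndForward:
    """Spool packets to a local directory when the telemetry link is down."""

    def __init__(self, send, spool_dir="spool"):
        self.send = send                 # callable that returns True on success
        self.spool_dir = spool_dir
        os.makedirs(spool_dir, exist_ok=True)
        self.pending = deque()

    def submit(self, seq, packet):
        if not self.send(packet):
            path = os.path.join(self.spool_dir, f"{seq:010d}.pkt")
            with open(path, "wb") as f:  # persist so an outage never loses data
                f.write(packet)
            self.pending.append(path)

    def drain(self):
        """Re-transmit spooled packets once the link is back up."""
        while self.pending:
            path = self.pending[0]
            with open(path, "rb") as f:
                if not self.send(f.read()):
                    break                # link dropped again; keep the backlog
            os.remove(path)
            self.pending.popleft()

# Example: simulate an outage by failing the first send attempt
link_up = iter([False, True])
spooler = StoreAndForward(send=lambda pkt: next(link_up))
spooler.submit(1, b"seismic+gnss packet 1")   # link down: spooled to disk
spooler.drain()                               # link restored: backlog re-sent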

The Trimble SG160-09 system is optimized for field use with instrument-mounted or externally mounted GNSS antenna configurations. The lightweight yet rugged SG160-09 consumes very little power and can be used on projects with remote connectivity and in extreme weather conditions. Because the SG160-09 combines GNSS and strong-motion recording in a single instrument, site installation time is reduced, data communications flow through a single pathway, and station power infrastructure is streamlined, making it a cost-competitive solution compared to other systems on the market today. It carries an IP67 rating, meaning it is sealed against dust and can be submerged in up to a meter of water for approximately 30 minutes, and it meets the MIL-STD-810F standard for drops, vibration and temperature extremes.

"The SG160-09 is another example of Trimble's on-going focus in GNSS and seismic technology for the scientific and engineering communities," said Ulrich Vollath, general manager for Trimble's Infrastructure Division. "Trimble has developed a combined state-of-the-art GNSS receiver with a high-dynamic range, low-noise accelerometer that provides dynamic monitoring with the flexibility required for today and tomorrow's challenges."

The Trimble SG160-09 SeismoGeodetic system is expected to be available in the fourth quarter of 2014. For more information, visit www.trimble.com, call 1-800-767-4822 (U.S. only) or +1-303-323-4111 (outside the U.S.), or email [email protected].

About Trimble

Trimble applies technology to make field and mobile workers in businesses and government significantly more productive. Solutions are focused on applications requiring position or location—including surveying, construction, agriculture, fleet and asset management, public safety and mapping. In addition to utilizing positioning technologies, such as GPS, lasers and optics, Trimble solutions may include software content specific to the needs of the user. Wireless technologies are utilized to deliver the solution to the user and to ensure a tight coupling of the field and the back office. Founded in 1978, Trimble is headquartered in Sunnyvale, Calif.

For more information, visit www.trimble.com.


SOURCE Trimble
