By Gil Allouche
February 24, 2014 05:00 PM EST
Unless you are an IT manager, understanding all of the technical details of how big data technology (such as Hadoop Hive or Apache Pig as a Service) works isn't as important as understanding how big data can be used to improve lives. Public safety, whether it's reducing crime rates or maintaining roads and bridges, is a huge undertaking with serious consequences when mistakes are made. Big data can provide additional insights, such as the connection between home games and crime rates discussed below, that help governments and businesses keep cities safe and respond better to disasters when they strike.
The U.S. Department of Transportation keeps track of more than 600,000 bridges, all of which are more than 20 feet long and used for vehicular traffic. Of those bridges, 25 percent have been classified as deficient. Unfortunately, repairing bridges is an expensive endeavor costing billions of dollars, and until recently, the process of inspecting bridges for problems was slow and inaccurate, since each bridge had to be inspected manually. Even small mistakes can occasionally lead to big consequences, as when the I-35W bridge in Minneapolis collapsed in 2007, killing 13 people and injuring 145.
Now, governments are taking advantage of big data to better track which bridges need maintenance, keeping bridges safe while cutting down on inspection time. By placing sensors at key points on a bridge, it is possible to track how various factors affect its structural integrity. For example, sensors can track everything from the impact of seismic activity or a heavy winter storm down to the day-to-day wear and tear from traffic. These sensors allow data to be collected in a central location without sending an inspector out to each individual bridge, and crucial readings can be analyzed in real time so that serious problems are addressed faster. The same data is also useful for distributing a limited repair budget, so the most important issues are addressed first.
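To make the idea concrete, here is a minimal sketch of how centrally collected sensor readings might be screened and ranked for urgency. The reading format, sensor types, thresholds, and the ranking formula are all assumptions invented for illustration, not any agency's actual system.

```python
# Hypothetical screening of bridge-sensor readings collected in a central
# location: flag readings over a threshold, then rank them so the worst
# problems are inspected first.
from dataclasses import dataclass

@dataclass
class SensorReading:
    bridge_id: str
    sensor_type: str   # e.g., "strain", "vibration", "tilt"
    value: float

# Assumed alert thresholds per sensor type (made up for this sketch).
THRESHOLDS = {"strain": 850.0, "vibration": 12.0, "tilt": 0.5}

def screen(readings):
    """Return threshold-exceeding readings, worst first."""
    alerts = [r for r in readings
              if r.value > THRESHOLDS.get(r.sensor_type, float("inf"))]
    # Rank by how far over threshold a reading is: a rough proxy for
    # urgency when the repair budget only covers the top of the list.
    return sorted(alerts,
                  key=lambda r: r.value / THRESHOLDS[r.sensor_type],
                  reverse=True)

if __name__ == "__main__":
    batch = [
        SensorReading("BR-102", "strain", 910.0),
        SensorReading("BR-317", "vibration", 3.2),
        SensorReading("BR-044", "tilt", 0.7),
    ]
    for r in screen(batch):
        print(f"ALERT {r.bridge_id}: {r.sensor_type}={r.value}")
```

In a real deployment the readings would arrive as a continuous stream rather than a batch, but the core logic, centralize, flag, prioritize, is the same.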
Cracking Down on Crime
When do crime rates tend to peak in a particular neighborhood? What streets, businesses, or events tend to attract the most crime? Which areas of the city need a higher police presence? For those familiar with their neighborhood and city, the answers to these questions may seem obvious, but for police agencies responsible for many neighborhoods, getting a clear picture of what is happening on every street can be difficult. With access to more data, however, such as historical crime records, the presence of graffiti, and the locations of police stations, cities can better distribute their resources to reduce crime.
In California, some high school students completed a big data project analyzing crime rates in San Francisco. After going through millions of data points, the students found some interesting insights. For instance, they identified 10 liquor stores, all within a two-mile radius of one another, near which a large share of the city's crimes occurred. One particular area, the 800 block of Bryant Street, averaged a crime every three hours, and whenever the Giants or 49ers played a home game, crime spiked by 20 percent. The spike tended to be larger for weekend games and home-team wins than for weekday games, losses, or ties.
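The game-day comparison at the heart of that finding is straightforward to reproduce. The sketch below uses invented crime counts, not the students' data, just to show the shape of the calculation.

```python
# Toy game-day comparison: average daily crime counts on home-game days
# vs. other days, expressed as a percentage spike. All numbers are
# hypothetical, invented for illustration.
game_days     = [310, 298, 325, 305]             # counts on home-game days
non_game_days = [255, 248, 262, 251, 244, 259]   # counts on other days

avg_game = sum(game_days) / len(game_days)
avg_other = sum(non_game_days) / len(non_game_days)
spike = (avg_game - avg_other) / avg_other * 100

print(f"Average on game days:  {avg_game:.1f}")
print(f"Average on other days: {avg_other:.1f}")
print(f"Game-day spike: {spike:+.1f}%")
```

Splitting the game-day list further, weekend vs. weekday, win vs. loss, would reproduce the secondary finding about which games drove the biggest spikes.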
Even though this project was limited to publicly available data, it serves as a good example of what cities and police departments, with far more extensive data sources, could accomplish by analyzing big data.
Boosting Utility Reliability
When natural disasters hit, countless citizens are often left without power. If a storm hits during the winter months, the loss of power can be particularly devastating, as residents are left without heat. Even smaller storms can have serious consequences when traffic lights stop working or vulnerable residents, such as the elderly and small children, lose heat. For this reason, utility companies have started analyzing the data they collect to improve response times to power outages and to distribute resources better when a larger storm strikes.
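One plausible form this analysis takes is triage: scoring incoming outage reports so crews reach the most critical failures first. The fields and weights in the sketch below are assumptions made up for illustration, not any utility's actual dispatch logic.

```python
# Hypothetical triage of outage reports: rank by customers affected,
# critical infrastructure on the circuit (hospitals, traffic signals),
# and whether residents are at risk of losing heat in winter.
outages = [
    {"id": "OUT-1", "customers": 1200, "critical_sites": 0, "heating_risk": False},
    {"id": "OUT-2", "customers": 150,  "critical_sites": 2, "heating_risk": True},
    {"id": "OUT-3", "customers": 4800, "critical_sites": 1, "heating_risk": False},
]

def priority(outage):
    # Weights are assumptions: critical infrastructure and loss of heat
    # outrank raw customer counts in this sketch.
    score = outage["customers"]
    score += outage["critical_sites"] * 5000
    if outage["heating_risk"]:
        score += 3000
    return score

# Dispatch crews to the highest-priority outages first.
for o in sorted(outages, key=priority, reverse=True):
    print(o["id"], priority(o))
```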
With so much data readily available, it makes sense for utility companies, government agencies, and cities to start using that data to improve public safety, and hopefully cut costs and improve efficiency in the long run.
Image Source: Wikimedia