
How Big Data Can Boost Public Safety

When the 49ers Win, Crime Rates Go Up

Unless you are an IT manager, understanding all of the technical details of how big data technology (such as Apache Hive or Pig offered as a service) works isn't as important as understanding how big data can be used to improve lives. Public safety, whether it's reducing crime rates or maintaining roads and bridges, is a huge undertaking with serious consequences when mistakes are made. Big data can give governments and businesses additional insights (such as the connection between home games and crime rates discussed below) that help them keep cities safe and respond better when disasters strike.

Bridge Maintenance
The U.S. Department of Transportation tracks more than 600,000 bridges, all of them more than 20 feet long and carrying vehicular traffic. Of those bridges, 25 percent have been classified as deficient. Unfortunately, repairing bridges is an expensive endeavor costing billions of dollars, and until recently the process of inspecting bridges for problems was slow and error-prone, since each bridge had to be inspected manually. Even small mistakes can have enormous consequences, as when the I-35W bridge in Minneapolis collapsed in 2007, killing 13 people and injuring 145.

Now, governments are taking advantage of big data to track which bridges need maintenance, keeping those bridges safe while cutting down on inspection time. By placing sensors at key points on a bridge, it is possible to track how various factors affect its structural integrity: everything from the impact of seismic activity or a heavy winter storm down to the day-to-day wear and tear of traffic. The sensors allow all of the data to be collected in a central location without sending an inspector out to each individual bridge, and crucial readings can be analyzed in real time so serious problems are addressed faster. The same data is also useful for distributing a limited budget, ensuring the most important issues are addressed first.
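As a rough illustration (not from the article), here is a minimal Python sketch of the kind of check such a monitoring system might run over centrally collected readings. The bridge names, field names and alert threshold are all invented for the example; real limits would come from each bridge's engineering specifications.

```python
from dataclasses import dataclass

# Hypothetical reading from a strain gauge mounted on a bridge girder.
@dataclass
class SensorReading:
    bridge_id: str
    sensor_id: str
    strain_microstrain: float  # deformation measured by the gauge

# Illustrative alert threshold; real limits depend on the bridge design.
STRAIN_ALERT_THRESHOLD = 2000.0

def flag_readings(readings):
    """Return readings that exceed the alert threshold, worst first,
    so a limited inspection budget goes to the most urgent bridges."""
    alerts = [r for r in readings if r.strain_microstrain > STRAIN_ALERT_THRESHOLD]
    return sorted(alerts, key=lambda r: r.strain_microstrain, reverse=True)

if __name__ == "__main__":
    stream = [
        SensorReading("bridge-a", "girder-12", 850.0),
        SensorReading("bridge-b", "tower-3", 2450.0),
        SensorReading("bridge-c", "deck-7", 2100.0),
    ]
    for reading in flag_readings(stream):
        print(f"ALERT {reading.bridge_id}/{reading.sensor_id}: "
              f"{reading.strain_microstrain} microstrain")
```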

Cracking Down on Crime
When do crime rates tend to peak in a particular neighborhood? Which streets, businesses or events tend to attract the most crime? Which areas of the city need a higher police presence? For those familiar with their neighborhood and city, the answers to these questions may seem obvious, but for police agencies in charge of several neighborhoods, getting a clear picture of what is happening on every street can be difficult. With access to more data, however, such as historical crime records, the presence of graffiti and the locations of police stations, cities can distribute their resources more effectively to reduce crime.

In California, some high school students completed a big data project analyzing crime rates in San Francisco. After going through millions of data points, the students found some interesting insights. For instance, 10 liquor stores, all within a 2-mile radius of one another, accounted for a huge majority of the crimes in the data. One area, the 800 block of Bryant Street, averaged a crime every three hours, and whenever the Giants or 49ers played a home game, crime spiked by 20 percent. The spike tended to be larger when the game fell on a weekend and when the home team won than for weekday games, losses or ties.
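To make that game-day comparison concrete, here is a minimal Python sketch of how such a spike could be computed with pandas. The incident records, the "date" column name and the game schedule below are placeholders for illustration, not the students' actual dataset.

```python
import pandas as pd

# Hypothetical stand-in for a public crime feed: one row per reported
# incident. Column name and dates are assumptions for the example.
incidents = pd.DataFrame({
    "date": pd.to_datetime([
        "2014-04-08", "2014-04-08", "2014-04-08",  # home-game day
        "2014-04-09", "2014-04-09",                # ordinary day
        "2014-04-10", "2014-04-10",                # ordinary day
    ]),
})
game_days = {pd.Timestamp("2014-04-08").date()}  # placeholder schedule

# Count incidents per calendar day, then compare game days to the rest.
daily = incidents.groupby(incidents["date"].dt.date).size()
on_game_day = daily.index.isin(game_days)

other_mean = daily[~on_game_day].mean()
spike_pct = 100.0 * (daily[on_game_day].mean() - other_mean) / other_mean
print(f"Home-game days averaged {spike_pct:.1f}% more incidents")
```

The same grouping, keyed by block or weekday instead of game day, would surface the other patterns the students reported, such as the concentration of crime around particular addresses.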

Even though this project was limited to publicly available data, it serves as a good example of what cities and police departments with far more extensive data sources could accomplish by analyzing big data.

Boosting Utility Reliability
When natural disasters hit, countless residents are often left without power. If a storm hits during the winter months, the outage can be particularly devastating, leaving residents without heat. Even smaller storms can have serious consequences when traffic lights stop working or vulnerable residents, such as the elderly and families with small children, lose heat. For this reason, utility companies have started analyzing the data they collect to improve response times to power outages and to better distribute resources when a larger storm strikes.
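As one hedged example of what that prioritization might look like, the sketch below ranks hypothetical outage reports so that areas with critical sites (traffic signals, hospitals, senior-care facilities) are restored first. The feeder IDs, counts and weighting are invented for illustration; a real utility would tune these from its own service data.

```python
# Hypothetical outage reports aggregated from smart meters:
# (feeder_id, customers_out, critical_sites_out), where critical sites
# are things like traffic signals and senior-care facilities.
OUTAGES = [
    ("feeder-114", 1200, 3),
    ("feeder-087", 4500, 0),
    ("feeder-221", 300, 5),
]

CRITICAL_WEIGHT = 1000  # assumption: one critical site outweighs 1,000 homes

def dispatch_order(outages):
    """Rank outages by weighted impact, heaviest first, to guide crews."""
    return sorted(
        outages,
        key=lambda o: o[1] + CRITICAL_WEIGHT * o[2],
        reverse=True,
    )

for feeder, customers, critical in dispatch_order(OUTAGES):
    print(f"{feeder}: {customers} customers out, {critical} critical sites")
```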

With so much data readily available, it makes sense for utility companies, government departments and cities to start using that data to improve public safety, and hopefully to cut costs and improve efficiency in the long run.


More Stories By Gil Allouche

Gil Allouche is the Vice President of Marketing at Qubole. He was most recently Sr. Director of Marketing at Karmasphere, a leading Big Data analytics company offering SQL access to Apache Hadoop, where he managed all marketing functions and developed a keen understanding of the Big Data market, its technologies and its buyers. Before Karmasphere, Gil was a product marketing manager and general manager for the TIBCO Silver Spotfire SaaS offering, where he developed and executed go-to-market plans that increased growth by 600 percent in just 18 months. He also co-founded 1Yell, a social media ad network company. Gil began his marketing career as a product strategist at SAP while earning his MBA at Babson College, and he is a former software engineer.
