How Big Data Can Boost Public Safety

When the 49ers Win, Crime Rates Go Up

Unless you are an IT manager, understanding all of the technical details of how big data technology works (Hadoop Hive or Apache Pig as a Service, for example) is less important than understanding how big data can be used to improve lives. Public safety, whether that means reducing crime rates or maintaining roads and bridges, is a huge undertaking with serious consequences when mistakes are made. Big data can give governments and businesses additional insights (such as the connection between home games and crime rates discussed below) to help them keep cities safe and respond better to disasters when they strike.

Bridge Maintenance
The U.S. Department of Transportation keeps track of more than 600,000 bridges, all of which are more than 20 feet long and carry vehicular traffic. Of those bridges, 25 percent have been classified as deficient. Unfortunately, repairing bridges is an expensive endeavor costing billions of dollars, and until recently the process of inspecting bridges for problems was slow and inaccurate, since each bridge had to be inspected manually. Even small mistakes can lead to big consequences, as when a bridge collapsed in Minneapolis in 2007, killing 13 people and injuring 145.

Now, governments are taking advantage of big data to better track which bridges need maintenance, keeping those bridges safe while cutting down on inspection time. By placing sensors at key points on a bridge, it is possible to track how different factors affect its structural integrity. Sensors can capture everything from the impact of seismic activity or a heavy winter storm down to the day-to-day wear and tear of traffic. The data is collected in a central location without sending an inspector out to each individual bridge, and crucial readings can be analyzed in real time so that serious problems are addressed faster. The same data is also useful for distributing a limited budget, so the most important issues are addressed first.
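To make the idea concrete, here is a minimal sketch of how such a system might rank bridges for inspection from streamed sensor readings. The bridge IDs, field names and strain threshold are illustrative assumptions, not details from any real deployment:

```python
# Hypothetical sketch: rank bridges for inspection from streamed sensor data.
# The bridge IDs, field names and strain threshold below are invented.
from collections import defaultdict

STRAIN_THRESHOLD = 0.8  # assumed alert level, as a fraction of rated capacity

readings = [  # each reading: (bridge_id, sensor_type, value)
    ("MN-0042", "strain", 0.91),
    ("MN-0042", "vibration", 0.40),
    ("CA-1307", "strain", 0.55),
    ("CA-1307", "strain", 0.85),
]

# Keep the worst strain reading seen for each bridge.
worst_strain = defaultdict(float)
for bridge_id, sensor_type, value in readings:
    if sensor_type == "strain":
        worst_strain[bridge_id] = max(worst_strain[bridge_id], value)

# Flag bridges over the threshold and rank them, so a limited budget
# goes to the most serious problems first.
flagged = sorted(
    ((b, v) for b, v in worst_strain.items() if v >= STRAIN_THRESHOLD),
    key=lambda item: item[1],
    reverse=True,
)
for bridge_id, strain in flagged:
    print(f"{bridge_id}: peak strain {strain:.2f} -- schedule inspection")
```

The same ranking step is what lets a limited repair budget flow to the worst problems first, rather than to whichever bridge happened to be inspected most recently.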

Cracking Down on Crime
When do crime rates tend to peak in a particular neighborhood? Which streets, businesses or events tend to attract more crime? Which areas of the city need a heavier police presence? For those familiar with their neighborhood and city, the answers may seem obvious, but for police agencies responsible for many neighborhoods, getting a clear picture of what is happening on every street is difficult. With access to more data, however, such as historical crime records, the presence of graffiti and the locations of police stations, cities can distribute their resources more effectively to reduce crime, as the sketch below illustrates.
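One simple way to combine such signals is a per-neighborhood risk score that weights patrol allocation. This sketch is purely illustrative; the neighborhoods, numbers and weights are assumptions, not findings from the article:

```python
# Hypothetical sketch: combine several public-data signals into a simple
# per-neighborhood risk score. All figures and weights below are invented.
neighborhoods = {
    "Mission":    {"incidents_per_1k": 42, "graffiti_reports": 120, "km_to_station": 1.8},
    "Sunset":     {"incidents_per_1k": 11, "graffiti_reports": 15,  "km_to_station": 3.2},
    "Tenderloin": {"incidents_per_1k": 65, "graffiti_reports": 210, "km_to_station": 0.6},
}

def risk_score(stats):
    # More historical crime and more graffiti raise the score;
    # a nearby police station lowers the need for extra patrols.
    return (1.0 * stats["incidents_per_1k"]
            + 0.1 * stats["graffiti_reports"]
            - 2.0 / max(stats["km_to_station"], 0.1))

ranked = sorted(neighborhoods.items(), key=lambda kv: risk_score(kv[1]), reverse=True)
for name, stats in ranked:
    print(f"{name}: score {risk_score(stats):.1f}")
```

In practice the weights would be fit from historical data rather than chosen by hand, but the structure (many weak signals folded into one allocation score) is the same.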

In California, some high school students completed a big data project analyzing crime rates in San Francisco. After going through millions of data points, the students found some interesting insights. For instance, there were 10 liquor stores, all within a 2-mile radius of one another, where a huge majority of crimes occurred. One area, the 800 block of Bryant Street, averaged a crime every three hours, and whenever the Giants or 49ers played a home game, crime spiked by 20 percent. The spike tended to be higher for weekend games than weekday games, and higher when the home team won than when it lost or tied.
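As a rough illustration of that last finding, the comparison behind a "20 percent spike" is just the difference in mean daily crime counts on home-game days versus other days. The daily counts below are invented; only the method mirrors the students' analysis:

```python
# Hypothetical sketch of the game-day comparison. The counts are made up;
# the method is comparing mean daily crime on home-game days vs. other days.
game_days = [135, 140, 136, 141]        # crimes per day with a Giants/49ers home game
other_days = [112, 118, 109, 121, 115]  # crimes per day otherwise

mean_game = sum(game_days) / len(game_days)
mean_other = sum(other_days) / len(other_days)
spike = (mean_game - mean_other) / mean_other * 100

print(f"Home-game days average {mean_game:.0f} crimes vs. {mean_other:.0f} "
      f"otherwise: a {spike:.0f} percent spike")
```

Splitting the game days further (weekend vs. weekday, win vs. loss or tie) is the same calculation applied to smaller subsets.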

Even though this project was limited to publicly available data, it is a good example of what cities and police departments with far more extensive data sources could accomplish by analyzing big data.

Boosting Utility Reliability
When natural disasters hit, countless residents are often left without power. If a storm hits during the winter months, the loss of power can be particularly devastating, leaving people without heat. Even smaller storms can have serious consequences when traffic lights stop working or when the elderly and small children are left in unheated homes. As a result, utility companies have started analyzing the data they collect to improve response times to outages and to better distribute resources when a larger storm strikes.
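One way that data can shape a storm response is by ranking restoration work so crews reach the most vulnerable customers first. The fields and weights in this sketch are assumptions made for illustration, not an actual utility's dispatch logic:

```python
# Hypothetical sketch: prioritize outage restoration by the number of
# customers affected and their vulnerability. Fields and weights are invented.
from dataclasses import dataclass

@dataclass
class Outage:
    feeder_id: str
    customers_out: int
    medical_priority_customers: int  # e.g., customers on life-support equipment
    freezing_temps: bool             # losing heat is dangerous in winter

def priority(o: Outage) -> float:
    score = o.customers_out + 50 * o.medical_priority_customers
    if o.freezing_temps:
        score *= 1.5  # weight winter outages more heavily
    return score

outages = [
    Outage("F-12", customers_out=800, medical_priority_customers=2, freezing_temps=True),
    Outage("F-07", customers_out=2500, medical_priority_customers=0, freezing_temps=False),
    Outage("F-31", customers_out=150, medical_priority_customers=5, freezing_temps=True),
]

for o in sorted(outages, key=priority, reverse=True):
    print(f"{o.feeder_id}: priority {priority(o):.0f}")
```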

With so much data readily available, it makes sense for utility companies, government departments and cities to start using it to improve public safety, and ideally to cut costs and improve efficiency in the long run.

Image Source: Wikimedia

More Stories By Gil Allouche

Gil Allouche is the Vice President of Marketing at Qubole. Most recently, he was Sr. Director of Marketing at Karmasphere, a leading big data analytics company offering SQL access to Apache Hadoop, where he managed all marketing functions; he brings a keen understanding of the big data target market, its technologies and its buyers. Prior to Karmasphere, Gil was a product marketing manager and general manager for the TIBCO Silver Spotfire SaaS offering, where he developed and executed go-to-market plans that increased growth by 600 percent in just 18 months. Gil also co-founded 1Yell, a social media ad network company. He began his marketing career as a product strategist at SAP while earning his MBA at Babson College, and he is a former software engineer.
