
How Big Data Can Boost Public Safety

When the 49ers Win, Crime Rates Go Up

Unless you are an IT manager, understanding every technical detail of how big data technology such as Hadoop Hive or Apache Pig as a service works isn't as important as understanding how big data can be used to improve lives. Public safety, whether it's reducing crime rates or maintaining roads and bridges, is a huge undertaking with serious consequences when mistakes are made. Big data can give governments and businesses additional insights (such as the connection between home games and crime rates discussed below) that help them keep cities safe and respond better to disasters when they strike.

Bridge Maintenance
The U.S. Department of Transportation keeps track of more than 600,000 bridges, all of them more than 20 feet long and carrying vehicular traffic. Of those bridges, 25 percent have been classified as deficient. Unfortunately, repairing bridges is an expensive endeavor costing billions of dollars, and until recently the process of inspecting bridges for problems was slow and inaccurate, since each bridge had to be inspected manually. Even small mistakes can lead to big consequences, as when the I-35W bridge in Minneapolis collapsed in 2007, killing 13 people and injuring 145.

Now, governments are taking advantage of big data to better track which bridges need maintenance, keeping those bridges safe while cutting down on inspection time. By placing sensors at key points on a bridge, it is possible to track how different factors affect its structural integrity: everything from the impact of seismic activity or a heavy winter storm down to the day-to-day wear and tear of traffic. These sensors allow all the data to be collected in a central location without sending an inspector out to each individual bridge, and allow crucial data to be analyzed in real time so that serious problems can be addressed faster. The same data is also useful for allocating a limited budget, so the most important issues are addressed first.
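
To make this concrete, here is a minimal sketch of how such a monitoring pipeline might flag the bridges that most need attention. The sensor fields, safety limit, and readings are all invented for illustration, not taken from any real DOT system.

```python
# Hypothetical sketch: rank bridges by how far their recent strain
# readings exceed a safety threshold, so inspection crews can be
# dispatched to the worst cases first. All values are invented.
from statistics import mean

STRAIN_LIMIT = 1.8  # hypothetical safe strain limit (arbitrary units)

readings = [
    {"bridge_id": "B-1042", "strain": [1.20, 1.30, 1.25]},
    {"bridge_id": "B-0007", "strain": [2.10, 2.30, 2.20]},  # well over the limit
    {"bridge_id": "B-0311", "strain": [1.90, 1.85, 1.95]},  # slightly over
]

# Score each bridge by how far its average strain exceeds the limit.
flagged = []
for r in readings:
    avg = mean(r["strain"])
    if avg > STRAIN_LIMIT:
        flagged.append((avg - STRAIN_LIMIT, r["bridge_id"], avg))

# Largest excess first: that bridge gets the next inspection slot.
for excess, bridge_id, avg in sorted(flagged, reverse=True):
    print(f"{bridge_id}: avg strain {avg:.2f} exceeds limit by {excess:.2f}")
```

The same ranking idea extends naturally to the budget question: spend first where the measured risk is greatest, rather than on a fixed rotation.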

Cracking Down on Crime
When do crime rates tend to peak in a particular neighborhood? Which streets, businesses, or events attract higher crime rates? Which areas of the city need a stronger police presence? For those familiar with their neighborhood and city, the answers to these questions may seem obvious, but for police agencies responsible for many neighborhoods, getting a clear picture of what is happening on every street can be difficult. With access to more data, however, such as historical crime records, the presence of graffiti, and the locations of police stations, cities can better distribute their resources to reduce crime.
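
At its core, this kind of analysis is simple aggregation: group incident records by location and time of day, then look at where the counts pile up. The records below are made up for illustration; only the method is the point.

```python
# Hypothetical sketch: count incidents per (block, hour) to surface
# hot spots worth extra patrols. Records are invented for illustration.
from collections import Counter

incidents = [
    {"block": "800 Bryant St", "hour": 22},
    {"block": "800 Bryant St", "hour": 23},
    {"block": "800 Bryant St", "hour": 22},
    {"block": "500 Market St", "hour": 14},
]

hot_spots = Counter((i["block"], i["hour"]) for i in incidents)

# The most frequent (block, hour) pairs suggest where and when to add patrols.
for (block, hour), count in hot_spots.most_common(3):
    print(f"{block} at {hour:02d}:00 -> {count} incidents")
```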

In California, a group of high school students completed a big data project analyzing crime rates in San Francisco. After going through millions of data points, the students found some interesting insights. For instance, there were 10 liquor stores, all within a 2-mile radius of one another, where a huge majority of the crimes occurred. One particular area, the 800 block of Bryant Street, averaged a crime every three hours, and whenever the Giants or 49ers played a home game, crime spiked by 20 percent. The spike tended to be higher when the game fell on a weekend and when the home team won, compared with weekday games, losses, or ties.
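
That 20 percent figure is the kind of number that falls out of a straightforward comparison of average incident counts on home-game days versus other days. The counts in this sketch are invented; only the method mirrors the analysis described above.

```python
# Hypothetical sketch: measure the crime spike on home-game days by
# comparing average daily incident counts. All counts are made up.

game_day_counts = [135, 140, 137, 140]    # incidents on home-game days
normal_day_counts = [112, 116, 114, 118]  # incidents on comparable non-game days

avg_game = sum(game_day_counts) / len(game_day_counts)        # 138.0
avg_normal = sum(normal_day_counts) / len(normal_day_counts)  # 115.0

spike = (avg_game - avg_normal) / avg_normal * 100
print(f"Home-game days average {spike:.0f}% more incidents")  # 20% here
```

Splitting the game-day group further (weekend vs. weekday, win vs. loss or tie) is the same comparison repeated on smaller slices of the data.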

Even though this project was limited to publicly available data, it is a good example of what cities and police departments with far more extensive data sources could accomplish by analyzing big data.

Boost Utility Reliability
When natural disasters hit, countless residents are often left without power. If a storm hits during the winter months, the loss of power can be particularly devastating, as residents are left without heat. Even smaller storms can have serious consequences when traffic lights stop working or when the elderly and small children are left in unheated homes. As a result, utility companies have started analyzing the data they collect to improve response times to power outages and to better distribute resources when a larger storm strikes.
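
As one hypothetical example of how a utility might act on that data, the sketch below triages outages by a simple priority score that weights critical loads, such as traffic signals and medically vulnerable households, more heavily than raw customer counts. The fields and weights are invented, not an actual utility's method.

```python
# Hypothetical sketch: order repair work by a priority score so crews
# reach critical outages first. Fields and weights are invented; a real
# utility would tune them against its own service data.
outages = [
    {"feeder": "F-12", "customers": 850, "traffic_signals": 2, "medical": 3},
    {"feeder": "F-03", "customers": 120, "traffic_signals": 0, "medical": 1},
    {"feeder": "F-27", "customers": 400, "traffic_signals": 5, "medical": 0},
]

def priority(outage):
    # Critical loads count far more than raw customer numbers.
    return (outage["customers"]
            + 500 * outage["traffic_signals"]
            + 800 * outage["medical"])

# Highest-priority outage gets the next available crew.
for o in sorted(outages, key=priority, reverse=True):
    print(f"{o['feeder']}: priority {priority(o)}")
```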

With so much data readily available, it makes sense for utility companies, government agencies, and cities to start using that data to improve public safety, and ideally to cut costs and improve efficiency in the long run.


More Stories By Gil Allouche

Gil Allouche is the Vice President of Marketing at Qubole. Most recently, he was Sr. Director of Marketing at Karmasphere, a leading Big Data analytics company offering SQL access to Apache Hadoop, where he managed all marketing functions; the role gave him a keen understanding of the Big Data target market, its technologies, and its buyers. Before Karmasphere, Gil was a product marketing manager and general manager for the TIBCO Silver Spotfire SaaS offering, where he developed and executed go-to-market plans that increased growth by 600 percent in just 18 months. Gil also co-founded 1Yell, a social media ad network company. He began his marketing career as a product strategist at SAP while earning his MBA at Babson College, and he is a former software engineer.
