How Big Data Can Boost Public Safety

When the 49ers Win, Crime Rates Go Up

Unless you are an IT manager, understanding the technical details of big data technology (such as Hadoop Hive or Apache Pig as a Service) matters less than understanding how big data can be used to improve lives. Public safety, whether reducing crime rates or maintaining roads and bridges, is a huge undertaking with serious consequences when mistakes are made. Big data can give governments and businesses additional insights (such as the connection between home games and crime rates discussed below) to help them keep cities safe and respond better when disasters strike.

Bridge Maintenance
The U.S. Department of Transportation tracks more than 600,000 bridges, all longer than 20 feet and carrying vehicular traffic. Of those, 25 percent have been classified as deficient. Unfortunately, repairing bridges is an expensive endeavor costing billions of dollars, and until recently the process of inspecting them was slow and inaccurate, since each bridge had to be inspected manually. Even small mistakes can lead to big consequences, as when a bridge collapsed in Minneapolis in 2007, killing 13 people and injuring 145.

Now governments are taking advantage of big data to better track which bridges need maintenance, keeping those bridges safe while cutting down on inspection time. By placing sensors at key points on a bridge, it is possible to track how various factors affect its structural integrity: everything from the impact of seismic activity or a heavy winter storm down to the day-to-day wear and tear of traffic. These sensors let all the data be collected in a central location without sending an inspector out to each individual bridge, and let crucial readings be analyzed in real time so that serious problems can be addressed faster. The data is also useful for distributing a limited budget, so the most important issues are addressed first.
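As a rough illustration of the real-time analysis described above, the sketch below flags bridges whose latest sensor reading drifts past a safety threshold. The bridge names, strain values and limit are all hypothetical; a real monitoring system would use calibrated, per-structure thresholds.

```python
# Hypothetical sketch: flagging bridge-sensor readings that exceed a
# safety threshold. All names and numbers are illustrative.

# Simulated strain-gauge readings (microstrain) collected centrally
# from three bridges; the last value in each series is the latest.
readings = {
    "bridge_a": [110, 112, 115, 118, 250],  # latest reading spikes
    "bridge_b": [95, 96, 94, 97, 95],
    "bridge_c": [130, 131, 129, 133, 132],
}

STRAIN_LIMIT = 200  # illustrative alert threshold

def flag_alerts(readings, limit):
    """Return the bridges whose latest reading exceeds the limit."""
    return {
        bridge: series[-1]
        for bridge, series in readings.items()
        if series[-1] > limit
    }

alerts = flag_alerts(readings, STRAIN_LIMIT)
for bridge, value in alerts.items():
    print(f"ALERT: {bridge} latest strain {value} exceeds {STRAIN_LIMIT}")
```

Because the readings arrive in one central store, a check like this can run continuously instead of waiting for a manual inspection cycle.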

Cracking Down on Crime
When do crime rates peak in a particular neighborhood? Which streets, businesses or events tend to attract the most crime? Which areas of the city need a heavier police presence? For those familiar with their neighborhood and city, the answers may seem obvious, but for police agencies covering many neighborhoods, getting a clear picture of what is happening on every street can be difficult. With access to more data, however, such as historical crime records, the presence of graffiti and the locations of police stations, cities can better distribute their resources to reduce crime.

In California, a group of high school students completed a big data project analyzing crime rates in San Francisco. After going through millions of data points, they found some interesting insights. For instance, 10 liquor stores, all within a 2-mile radius of one another, accounted for a huge majority of crimes. One area, the 800 block of Bryant Street, averaged a crime every three hours, and whenever the Giants or 49ers played a home game, crime spiked by about 20 percent. The spike tended to be larger when the game fell on a weekend or the home team won than for weekday games, losses or ties.
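The game-day spike is the kind of figure that falls out of a simple comparison of daily incident counts. The sketch below shows the arithmetic on made-up numbers; the students' actual analysis ran over millions of real records.

```python
# Hypothetical sketch: estimating a game-day crime spike by comparing
# average incidents on home-game days vs. other days. The counts below
# are invented for illustration.
daily_incidents = {
    "2014-06-01": (False, 40),  # (home_game?, incident count)
    "2014-06-02": (False, 42),
    "2014-06-03": (True, 50),
    "2014-06-04": (False, 38),
    "2014-06-05": (True, 48),
}

def game_day_spike(daily):
    """Percent increase in average incidents on home-game days."""
    game = [n for is_game, n in daily.values() if is_game]
    off = [n for is_game, n in daily.values() if not is_game]
    game_avg = sum(game) / len(game)
    off_avg = sum(off) / len(off)
    return (game_avg - off_avg) / off_avg * 100

print(f"Game-day spike: {game_day_spike(daily_incidents):.1f}%")
```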

Even though this project was limited to publicly available data, it serves as a good example of what cities and police stations with much more extensive data sources could do by analyzing big data.

Boosting Utility Reliability
When natural disasters hit, countless residents are often left without power. If a storm strikes during the winter months, the loss of power can be particularly devastating, leaving people without heat, and even smaller storms have serious consequences when traffic lights stop working or the elderly and small children lose heating. As a result, utility companies have started analyzing the data they collect to improve response times to power outages and to better distribute resources when a larger storm strikes.
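One way a utility might use its data to distribute crews after a storm is to score each outage area and dispatch in priority order. The areas, counts and weighting below are assumptions for illustration, not any utility's actual scheme.

```python
# Hypothetical sketch: ranking outage areas for crew dispatch by a
# simple score combining affected customers and critical sites
# (hospitals, care homes). All numbers and weights are illustrative.
outages = [
    {"area": "riverside", "customers": 1200, "critical_sites": 1},
    {"area": "downtown", "customers": 800, "critical_sites": 3},
    {"area": "hillcrest", "customers": 300, "critical_sites": 0},
]

def priority(outage, critical_weight=500):
    # Each critical site counts like 500 affected customers.
    return outage["customers"] + critical_weight * outage["critical_sites"]

dispatch_order = sorted(outages, key=priority, reverse=True)
for o in dispatch_order:
    print(o["area"], priority(o))
```

Even a crude score like this makes the trade-off explicit: downtown, with fewer customers but three critical sites, jumps ahead of the larger riverside outage.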

With so much data readily available, it makes sense for utility companies, departments and cities to start using that data to improve public safety, and hopefully cut costs and improve efficiency in the long run.

Image Source: Wikimedia

More Stories By Gil Allouche

Gil Allouche is the Vice President of Marketing at Qubole. Most recently, as Senior Director of Marketing for Karmasphere, a Big Data analytics company offering SQL access to Apache Hadoop, he managed all marketing functions, and he brings a keen understanding of the Big Data market, its technologies and its buyers. Prior to Karmasphere, Gil was a product marketing manager and general manager for the TIBCO Silver Spotfire SaaS offering, where he developed and executed go-to-market plans that increased growth by 600 percent in 18 months. He also co-founded 1Yell, a social media ad network company. Gil began his marketing career as a product strategist at SAP while earning his MBA at Babson College, and is a former software engineer.
