Harnessing Big Data for Product Improvement

How UK data solutions developer Systems Mechanics uses HP Vertica for BI, streaming and data analysis

Three years ago, Systems Mechanics Limited used relational databases to assemble and analyze some 20 different data sources in near real time. But most relational database appliances relied on 1980s-era technical approaches, and the ability to connect more data and manage more events hit a ceiling. The runway for the company’s business expansion simply ended.

So Systems Mechanics looked for a platform that scales well and provides real-time data analysis, too. At the volumes and price they needed, HP Vertica has since scaled without limit ... an endless runway.

To learn more about how Systems Mechanics built its products to deliver business intelligence (BI), streaming analytics, and data analysis, BriefingsDirect spoke with Andy Stubley, Vice President of Sales and Marketing at Systems Mechanics, based in London. The discussion, recorded at the HP Discover conference in Barcelona, is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: You've been doing a lot with data analysis at Systems Mechanics, and monetizing that in some very compelling ways.

Stubley: Yes, indeed. Systems Mechanics is principally a consultancy and a software developer. We’ve been working in the telco space for the last 10 to 15 years. We also have a history in retail and financial services.

The focus we’ve had recently, and the products we’ve developed into our Zen family, are based on big data, particularly in telcos, as they evolve from carrying principally analog voice conversations to supporting smartphones and their applications -- and data becomes ever more important.

All that data and all those people connected to the network generate many more events that need to be managed, and that data is both a cost to the business and an opportunity to optimize the business. So there is a cost reduction we can apply, and a revenue upside as well.

Quick example

Gardner: What’s a typical way telcos use Zen and its analysis?

Stubley: Let’s take a scenario where you’re on a network and you can’t make a phone call. Two major systems are catching that information. One is a fault-management system, which tells you there is a fault on the network and reports it back to the telco itself.

The second is the performance-management system. That doesn’t flag specific faults, but it tells you when thresholds are being breached, which may have an impact on performance. Either of those can have an impact on your customer, and from the customer’s perspective, you might also be having a problem with the network that isn’t reported by either system.

We’re finding that social media is getting a bigger play in this space. Why is that? Younger customers of consumer-facing telcos, mobile telcos in particular, will get onto social media and trash the brand if they can’t get a signal or can’t make a phone call.

They’re making noise. So a trend now is to combine fault management and performance management, which are logical partners, with social media. All of a sudden, rather than having a couple of systems, you have three.

In our world, we can put 25 or 30 different data sources onto a single Zen platform. In fact, there is no theoretical limit to the number we could handle, but 20 to 30 is quite typical now. That enables us to manage all the different network elements and different types of mobile technologies -- LTE, 3G, and 2G. It could be Ericsson, Nokia, Huawei, ZTE, or Alcatel-Lucent. There is an amazing range of equipment, all currently managed through separate entities. We’re offering a platform to pull it all together in one unit.
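
To make that concrete, here is a minimal Python sketch of the kind of normalization a multi-vendor platform implies: each vendor feed gets a small adapter that maps its raw fields onto one common event schema before anything downstream sees it. The schema, field names, and severity codes here are hypothetical, not Zen’s actual design.

```python
# Hypothetical sketch: mapping multi-vendor network events onto one
# common schema. Field names and severity codes are illustrative only.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class NetworkEvent:
    source: str         # feed identifier, e.g. "ericsson-3g"
    element_id: str     # network element that raised the event
    severity: int       # normalized: 1 = critical ... 5 = info
    message: str
    timestamp: datetime

def parse_vendor_alarm(raw: dict, source: str) -> NetworkEvent:
    """One small adapter per feed; the platform only sees NetworkEvent."""
    sev_map = {"CRITICAL": 1, "MAJOR": 2, "MINOR": 3, "WARNING": 4}
    return NetworkEvent(
        source=source,
        element_id=raw["element"],
        severity=sev_map.get(raw["severity"], 5),
        message=raw["text"],
        timestamp=datetime.fromtimestamp(raw["time"], tz=timezone.utc),
    )
```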

The other way I tend to look at it is that we’re trying to make a telco work the way you might view a human. Humans are arguably still the best decision-making platforms in the world. As humans, we have conscious and unconscious processes running. We don’t think about breathing or pumping our blood around our system, but it’s happening all the time.

We have senses that are pulling in massive amounts of information from the outside world. You’re listening to me now, and you’re probably doing a bunch of other things while you’re tapping away on a table as well. Your senses are taking in information as you see, hear, feel, touch, and taste.

All of that is information coming into the body, but most of the activity is subconscious. In the world of big data, that is the Zen goal: what we’re delivering in a number of places is to move as many actions as possible in a telco environment -- a network environment -- to that automatic, subconscious state.

Suppose I have a problem on a network. It gets relayed back to the people who need to know, but it doesn’t require human intervention. We’re looking at a position where human intervention is reserved for studying patterns in that information and deciding what can be done intellectually to make the business better.

That probably speaks to another point here. We use a solution with visualization, because in the world of big data, you can’t understand the data as raw numbers. The human brain isn’t capable of processing that much, but it is capable of identifying patterns in pictures, and that’s where we go with our visualization technology.

Gather and use data

We have a customer that is one of the largest telcos in EMEA. They’re taking in some 90,000 alarms from the network a day, across all their subsidiary companies, into one environment. And 90,000 alarms needing manual intervention is a very big number.

Using the Zen technology, we’ve been able to reduce that to 10,000 alarms. We’ve effectively taken 90 percent of the manual processing out of that environment. Now, 10,000 is still a lot of alarms to deal with, but it’s a lot less frightening than 90,000, and that’s a real impact in human terms.
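
The interview doesn’t describe Zen’s actual correlation rules, but the mechanism behind that kind of reduction is typically deduplication and grouping. Here is a hedged Python sketch of the idea, with invented field names; real rule sets are far richer, this only shows why 90,000 raw alarms can collapse to 10,000.

```python
# Illustrative only: collapse repeated alarms from the same element and
# alarm type within a time window into one alarm with a repeat count.
from collections import defaultdict

def correlate(alarms, window_seconds=300):
    groups = defaultdict(list)
    for alarm in alarms:
        # Bucket by element, type, and five-minute window (epoch seconds).
        bucket = int(alarm["timestamp"] // window_seconds)
        groups[(alarm["element_id"], alarm["type"], bucket)].append(alarm)
    for members in groups.values():
        representative = dict(members[0])
        representative["repeat_count"] = len(members)
        yield representative
```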

Gardner: Now that we understand what you do, let’s get into how you do it. What’s beneath the covers in your Zen system that allows you to confidently say you can take any volume of data you want?

Stubley: Fundamentally, that comes down to the architecture we built for Zen. The first element is our data-integration layer. We have a technology that we developed over the last 10 years specifically to capture data in telco networks. It’s real-time and rugged and it can deal with any volume. That enables us to take anything from the network and push it into our real-time database, which is HP’s Vertica solution, part of the HP HAVEn family.

Vertica allows us to record essentially any amount of data in real time and scale automatically on the HP hardware platform we also use. If we need more processing power, we can add more servers and scale transparently. That enables us to ingest any amount of data, which we can then process.
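
As an illustration of that load path, here is a minimal sketch using the open-source vertica-python client and Vertica’s bulk COPY statement. The table, columns, and connection details are assumptions for the example, not Systems Mechanics’ actual schema.

```python
# Assumed example: bulk-loading captured events into Vertica with the
# vertica-python client. Table and column names are invented.
import vertica_python

conn_info = {
    "host": "vertica.example.internal",
    "port": 5433,
    "user": "loader",
    "password": "secret",
    "database": "zen",
}

rows = "ericsson-3g,cell-0042,2,link degraded,2014-01-20 12:00:00\n"

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # COPY streams rows in bulk -- the usual pattern for high-volume
    # feeds, rather than row-by-row INSERTs.
    cur.copy(
        "COPY network_events (source, element_id, severity, message, ts) "
        "FROM STDIN DELIMITER ','",
        rows,
    )
```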

We have two processing layers. Referring to our earlier discussion about conscious and subconscious activity, our conscious activity is visualizing that data, and that’s done with Tableau.

We have a number of Tableau reports and dashboards with each of our product solutions. That enables us to visualize what’s happening and allows the organization -- the guys running the network and the guys looking at different elements in the data -- to make their own decisions and identify what they might do.

We also have a streaming analytics engine that listens to the data as it comes into the system before it goes to Vertica. If we spot the patterns we’ve identified earlier “subconsciously,” we’ll then act on that data, which may be reducing an alarm count. It may be "actioning" something.

It may be sending someone an email. It may be creating a trouble ticket on a different system. Those all happen transparently and automatically. It’s four layers simplifying the solution: data capture, data integration, visualization, and automatic analytics.
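
To make the streaming layer concrete, here is a hedged sketch of that "subconscious" path: rules watch events on their way to storage and fire actions automatically. The rule, the action, and the event fields are all illustrative; the interview doesn’t describe Zen’s actual engine or API.

```python
# Illustrative stream-rule sketch: events are checked against a rule on
# the way to the database; matches trigger an automatic action.
def repeated_critical(event) -> bool:
    return event["severity"] <= 2 and event.get("repeat_count", 1) > 10

def open_ticket(event) -> None:
    # Stand-in for a real side effect: emailing an operator, raising a
    # trouble ticket in another system, or suppressing an alarm.
    print(f"ticket opened for {event['element_id']}")

def process(stream, rule=repeated_critical, action=open_ticket):
    for event in stream:
        if rule(event):
            action(event)
        yield event  # everything still flows on to the database
```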

Developing high value

Gardner: And when you have the confidence to scale your underlying architecture and infrastructure, and you’re able to visualize and deliver high value to a vertical industry like telco, that allows you to expand into more lines of business in terms of products and services, and also into more verticals. Where have you taken this with the Zen family, and where do you take it now in terms of your market opportunity?

Stubley: We focus on mobile telcos. That’s our heritage. We can take any data source from a telco, but we can actually take any data source from anywhere, on any platform, in any company. That ranges from binary to HTML. You name it: if you’ve got data, we can load it.

That means we can build our processing accordingly. What we do is provide what we call solution packs. A solution pack is a connector to the outside world -- to the network -- that grabs the data. We’ve got an element of data modeling there, so we can load the data into Vertica. Then we have prebuilt reports in Tableau that allow us to interrogate the data automatically. That’s at a component level.
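
Structurally, a solution pack as described bundles three things: a connector, a data model, and prebuilt reports. A hypothetical Python sketch of that bundle, for orientation only:

```python
# Hypothetical structure -- the interview names the three parts of a
# solution pack, not their implementation.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class SolutionPack:
    name: str                      # e.g. "fault-management"
    connector: Callable[[], list]  # grabs raw data from the source
    schema_ddl: str                # data model loaded into Vertica
    reports: List[str] = field(default_factory=list)  # Tableau dashboards
```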

Once you have a number of components, we can look horizontally across those different items and at the behaviors where they interact with each other. In pure telco terms, we would be looking at different network devices and the end-to-end performance of the network, but the same would apply to a fraud scenario, or to someone who is running cable TV.

The very highest level is finding what problem you’re going to solve and then using the data to solve it.

So multi-play operators are interesting, because they want to monitor what’s happening with TV as well, and that fits in exactly the same category. Realistically, anybody with high-volume, real-time data can benefit from Vertica.

Another interesting play in this scenario is social gaming and online advertising. They all have similar data characteristics: very high-volume, fixed data that needs to be analyzed and processed automatically.

Why Vertica?

Gardner: How long have you been using Vertica, and what is it that drove you to using it vis-à-vis alternatives?

Stubley: As far as the Zen family goes, we have used other technologies in the past, other relational databases, but we’ve used Vertica now for more than two-and-a-half years. We were looking for a platform that could scale and would give us real-time data. At the volumes we were looking at, nothing could compete with Vertica at a sensible price. You can build yourself almost any solution with enough money, but we haven’t got too many customers who are prepared to make that investment.

So Vertica fits in with the technology of the 21st century. A lot of the relational database appliances are using 1980s thought processes. What’s happened with processing in the last few years is that nobody shares memory anymore, and our environment requires a non-shared-memory solution. Vertica has been built on that basis. It scales without limit.

One of the areas we’re looking at, which I mentioned earlier, is social media. Social media is a very natural play for Hadoop, and Hadoop is clearly a very cost-effective platform for vast volumes of data, with real-time data loading -- but it’s very slow to analyze.

So the combination of a high-volume, low-cost platform for the bulk of the data and a very high-performing real-time analytics engine is very compelling. The challenge is going to be moving the data between the two environments. That isn’t going to go away. It’s not simple, and there are a number of approaches; HP Vertica is taking some of them.

There is Flex Zone, and there are any number of other players in that space. The reality is that you probably reach an environment where people are loading Hadoop and Vertica in parallel. That’s what we plan to do, and it gives you much more resilience. For a lot of the data we’re putting into our system, we’re planning to put the raw data files into Hadoop as well, so we can reload them as necessary and improve the resilience of the overall system.
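
A minimal sketch of that parallel-load pattern, assuming an HDFS command-line client on the path and the vertica-python client; the paths, table name, and connection details are invented for the example:

```python
# Assumed example: archive the raw file in Hadoop for resilience and
# bulk-load the same file into Vertica for real-time analytics.
import subprocess
import vertica_python

def parallel_load(local_path: str, hdfs_dir: str, conn_info: dict) -> None:
    # 1. Keep the raw file in HDFS so it can be reloaded later if needed.
    subprocess.run(["hdfs", "dfs", "-put", local_path, hdfs_dir], check=True)
    # 2. Load the same file into Vertica.
    with vertica_python.connect(**conn_info) as conn:
        with open(local_path) as f:
            conn.cursor().copy(
                "COPY network_events FROM STDIN DELIMITER ','", f)
```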
