
The Intelligence Inside: Cloud Developers Change the World of Analytics

Evidence is mounting that embedding analytics inside apps business people use every day can lead to quantifiable benefits

Slide Deck from Karl Van den Bergh's Cloud Expo Presentation: The Intelligence Inside: How Developers of Cloud Apps Will Change the World of Analytics

We live in a world that requires us to compete on our differential use of time and information, yet only a fraction of information workers today have access to the analytical capabilities they need to make better decisions. Now, with the advent of a new generation of embedded business intelligence (BI) platforms, cloud developers are disrupting the world of analytics. They are using these new BI platforms to inject more intelligence into the applications business people use every day. As a result, data-driven decision-making is finally on track to become the rule, not the exception.

The Increased Focus on Analytics
With the emphasis on data-driven decision-making, it is perhaps no surprise that the focus on analytics continues to mount. According to IDC's Dan Vesset, 2013 was poised to be the first year in which the market for data-driven decision-making enabled by business analytics broke through the $100 billion mark. IT executives are also doubling down on analytics, a fact highlighted by Gartner's annual CIO survey, which has ranked analytics as the number-one technology priority in three of the last five years. So, given the importance of and spend on analytics, everyone should have access to the insight they need, right?

Most Business People Still Don't Use Analytics
Amazingly, in spite of spending growth and focus, most information workers today do not have access to business intelligence. In fact, Cindi Howson of BI Scorecard has found that end-user adoption of BI seems to have stagnated at about 25%. This stagnation is difficult to reconcile. How is it possible that, at best, one quarter of information workers have access to what is arguably most critical to their success in a world that runs on data?

There are a variety of reasons for stagnant end-user adoption, including the high costs associated with BI projects and an overall lack of usability. However, the biggest impediment to BI adoption has nothing to do with the technology. The reality is that the vast majority of business decision makers do not spend their day working in a BI tool - nor do they want to. Users already have their preferred tool or application: sales representatives use a CRM service; marketers use a campaign management or marketing automation platform; back-office workers spend much of their day in an ERP application; executives typically work with their preferred productivity suite; and the list goes on. Unless you are a data analyst, you are not going to want to spend much of your day in a BI tool. But just because business people prefer not to use a BI tool does not mean they don't want access to pertinent data to support better decision-making.

The Need for More Intelligence Inside Applications
What's the solution? Simply put, bring the data TO users inside their preferred applications instead of expecting them to go to a separate BI system to find the report, dashboard or visualization that's relevant to the question at hand. If we want to reach the other 75% of business people who don't have access to a standalone BI product, we have to inject intelligence inside the applications and services they use every day. It is only through more intelligent applications that organizations can benefit from broader data-driven decision-making. In fact, according to Gartner, BI will only become pervasive when it essentially becomes "invisible" to business people as part of the applications they use daily. In a 2013 report highlighting key emerging tech trends, Gartner concludes that in order "to make analytics more actionable and pervasively deployed, BI and analytics professionals must make analytics more invisible and transparent to their users." How? The report explains this will happen "through embedded analytic applications at the point of decision or action."

If the solution to pervasive BI is to deliver greater intelligence inside applications, why don't more applications embed analytics? The reality is that only a small fraction of applications built today have embedded intelligence. Sure, they might include a table or a chart, but there is no intelligent engine behind it; users typically can't personalize a report or dashboard, or self-serve to generate new visualizations on an ad-hoc basis. The culprit is that business intelligence was originally designed as a standalone activity, not an embeddable one. Specifically, the reasons developers avoid BI platforms boil down to cost and complexity.

Cost and Complexity Are Barriers to Embedded BI
Traditionally, BI tools have carried a user-based licensing model, with licenses typically running from tens of thousands to millions of dollars. Such high per-user costs might be justified for a relatively small, predictably sized population that includes a large percentage of power users who will spend a good amount of time working with the BI tool. This user-based model, however, is totally unsuitable for the embedded use case, which is geared toward business users who access BI features less frequently and likely have less analytics experience than the traditional power user. In this scenario, high per-user costs simply can't be justified.

BI products are also complex on a number of levels. First, they are complex to deploy, often requiring months if not years to roll out to any reasonable number of users. Second, they are complex to use, both for the developers building the reports and dashboards and for the business people interacting with the tool. Third, they are complex to embed: designed as standalone products, BI tools are not architected to plug into another application.

Given the cost and complexity of traditional standalone BI offerings, it is no surprise that developers often turn to charting libraries to deliver the visualizations within their application. The cost is low and they are relatively simple for a developer to embed. In the short term, a charting library is a reasonable solution, but over time falls flat. The demands for more charts, dashboards and reports quickly grow, and end users begin looking for the ability to self-serve and create their own visualizations. As a result of these mounting demands, many application developers find themselves essentially building a BI tool, taking them outside their core competency and stealing precious time away from advancing their own application.

Could a New Generation of Embedded BI Provide the Solution?
Fortunately, there is a new generation of embedded analytic platforms emerging that looks set to address these challenges of cost and complexity. Wayne Eckerson, a noted BI analyst, identifies this as the third generation of embedded analytics in his article on the Evolution of Embedded BI. In summary, Eckerson describes the third generation as "moving beyond the Web to the Cloud" where developers can "rent these Cloud-based BI tools by the hour." These BI platforms can "support a full range of BI functionality including data exploration and authoring" and can be embedded through standard interfaces like REST and JavaScript. So, how does this third generation address the issues of cost and complexity?

Utility Pricing Dramatically Reduces Cost
To address the challenge of cost, a new generation of embedded analytics platforms employs a utility-based licensing model where the software is available on a per-core, per-hour or per-gigabyte basis. From a developer's perspective, this is a much fairer model, as one only pays for what is used. At the beginning of the application lifecycle when usage is sporadic, developers can limit their costs. As the application becomes successful and use grows, usage can be easily scaled up. A recent report by Nucleus Research concluded that utility pricing for analytics can save organizations up to 70% of what they would pay for a traditional BI solution. I've written previously about how utility pricing will dramatically increase the availability of analytics, reaching a much broader set of organizations. The rapid adoption of Amazon's Redshift data warehousing service and Jaspersoft's reporting and analytics service on the AWS Marketplace provides rich testimony to the benefits of this model.
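To see why utility pricing changes the economics, consider a back-of-the-envelope comparison. The figures below (per-user license price, per-hour rate, usage hours) are illustrative assumptions, not real vendor pricing, and the actual savings in any deployment depend entirely on the numbers plugged in:

```python
# Illustrative cost comparison: per-user licensing vs. utility (per-hour) pricing.
# All prices and usage figures are hypothetical assumptions for this sketch.

def per_user_annual_cost(users: int, price_per_user: float = 1500.0) -> float:
    """Traditional model: every provisioned user is paid for, active or not."""
    return users * price_per_user

def utility_annual_cost(hours_per_day: float, price_per_hour: float = 1.50) -> float:
    """Utility model: pay only for the hours the analytics engine actually runs."""
    return hours_per_day * 365 * price_per_hour

# An application with 2,000 registered users whose embedded analytics
# engine runs about 8 hours per day:
traditional = per_user_annual_cost(2000)   # 3,000,000.0
utility = utility_annual_cost(8)           # 4,380.0
```

The key structural point the sketch captures is that the traditional model scales cost with the number of provisioned users, while the utility model scales it with actual consumption - exactly the property that suits the sporadic, growing usage pattern of an embedded application.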

Cloud and Web-Standard APIs Reduce Complexity
A cloud-based BI platform significantly simplifies deployment, as there is no BI server to install or configure. The Nucleus Research report found that the utility-priced, Cloud BI solutions could be deployed in weeks or even days as opposed to the months commonly required for a traditional BI product.

Leveraging web-standard APIs like REST and JavaScript, the third-generation platforms also simplify the task of embedding analytics on both the front end and the back end of the application. Importantly, these APIs allow full-featured, self-service BI capabilities to be embedded, not just reports and dashboards. This increases the application's ability to respond to the ad-hoc information requests of business users.
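To make the REST-embedding idea concrete, here is a minimal sketch of how an application's back end might compose a request for a rendered, per-user-filtered report from a Cloud BI service. The host, endpoint path, and token scheme below are invented for illustration; any real platform's API will differ:

```python
# Hypothetical sketch: composing an authenticated REST URL to fetch a report
# from a cloud BI service for embedding in an application page.
# The host, path, and query parameters are assumptions, not a real API.
from urllib.parse import urlencode, urlsplit, parse_qs

def build_report_url(base: str, report_id: str, token: str,
                     fmt: str = "html", **filters: str) -> str:
    """Compose the URL for an embedded, per-user-filtered report."""
    params = {"format": fmt, "access_token": token, **filters}
    return f"{base}/reports/{report_id}?{urlencode(params)}"

url = build_report_url("https://bi.example.com/api/v1", "sales-by-region",
                       token="tok123", region="EMEA")
# The application would fetch this URL server-side (e.g. with urllib.request)
# and inject the returned HTML fragment into its own page, or hand the same
# endpoint to a JavaScript client for interactive, self-service rendering.
```

The design point is that the BI engine stays behind a web-standard interface, so the host application embeds analytics the same way it would consume any other web service.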

The Benefits of Embedded Intelligence
Intuitively, it would seem that, by providing analytics within the applications business people use every day, an organization should experience the benefits of more data-driven decision-making. But is there any proof?

A recent report by the Aberdeen Group, based on data from over 130 organizations, has helped shed light on some of the benefits of embedded analytics. First, as might be expected, those companies using embedded analytics saw 76% of users actively engaged in analytics versus only 11% for those with the lowest embedded BI adoption. As a result, 89% of the business people in these best-in-class companies were satisfied with their access to data versus only 21% in the industry laggards. The bottom line? Companies leading embedded BI adoption saw an average 19% increase in operating profit versus only 9% for the other companies.

Andre Gayle, who helps manage a voicemail service at British Telecom, illustrates the difference embedded analytics can make. "We had reports [before] but they had to be emailed to users, who had to wait for them, then dig through them as needed. It was inefficient and wasteful." Now, thanks to embedded analytics, British Telecom has seen a huge savings in time and cost. As Gayle explains, capacity planning for the voicemail service used to be a "laborious exercise, involving several days of effort to dig up the numbers" but now can be done "on demand, in a fact-based manner, in just a few minutes."

The evidence is mounting that embedding analytics inside the applications business people use every day can lead to quantifiable benefits. However, the protagonist here, unlike in the traditional world of analytics, must be the developer, not the analyst. A new generation of embedded BI platforms is making it easier and more cost effective for developers to deliver the analytical capabilities needed inside the Cloud applications they are building. As developers increasingly avail themselves of these new platforms, we can hope that BI will finally become pervasive as an information service that informs day-to-day operations. As Wayne Eckerson puts it, "In many ways, embedded BI represents the fulfillment of BI's promise." Now it's up to Cloud developers to help us realize that promise.

More Stories By Karl Van den Bergh

Karl Van den Bergh is the Vice President of Product Strategy at Jaspersoft, where he is responsible for product strategy, product management and product marketing. Karl is a seasoned high-tech executive with 18 years of experience in software, hardware, open source and SaaS businesses, both startup and established.

Prior to Jaspersoft, Karl was the Vice President of Marketing and Alliances at Kickfire, a venture-funded data warehouse appliance startup. He also spent seven years at Business Objects (now part of SAP), where he held progressively senior leadership positions in product marketing, product management, corporate development and strategy – ultimately becoming the General Manager of the Information-On-Demand business. Earlier in his career, he was responsible for EMEA marketing at ASG, one of the world’s largest privately-held software companies. Karl started his career as a software engineer.


