True Democratization of Analytics with Meta-Learning

There are many solutions that claim to democratize analytics, but most are severely constrained. A meta-learning approach democratizes without those limits.

The democratization of analytics has become a popular term, and a quick Google search will generate results that explore the necessity of empowering more people with analytics and the rise of citizen data scientists. The ability to easily make better use of your (constantly growing) pool of data is a critical driver of business success, but many of the existing solutions that claim to democratize analytics only do so within severe limits. If you have a complex business scenario and expect these tools to deliver revolutionary insights, it’s easy to come away disappointed.

However, the democratization of analytics isn’t just a buzzword that refers to a narrow approach. It’s possible to do so much more. Let’s quickly review the current state of the market that you’re likely familiar with, and then dive into our proposed solution.

Lightweight Solutions that Oversimplify

One class of solution is marketed as simple because it works in an environment business leaders are already familiar with, like Excel or Tableau. These solutions tend to be lightweight and are really about easily generating a digestible report. That’s all well and good, but it democratizes report generation and lightweight analysis rather than enabling you to develop truly predictive scenarios that require Machine Learning.

Narrowly Defined Analytics as a Service

Another option that is gaining adoption is to use pre-trained models usable out of the box for image analysis and classification, speech-to-text conversion, and translation services. While these make certain limited use cases available to more organizations, they don’t actually democratize the predictive analytics processing related to business-specific time-series scenarios.

Cloud Environments that are only a Framework

Finally, there are numerous cloud vendors that take care of managing the infrastructure necessary for Big Data analytics and Machine Learning, whether it’s hosting Hadoop/MapReduce, Spark, etc., providing managed database support, or hosting machine intelligence software libraries like TensorFlow. At the end of the day, these options are really democratizing the infrastructure necessary to support Machine Learning—they aren’t democratizing the Data Scientist lifecycle itself, something we discuss in detail a little later in the post.

But What about More Sophisticated Business Scenarios?  

The solutions above may technically “democratize” some form of analytics, but they fall short in democratizing Machine Learning for individual business use cases like predictive maintenance for the Industrial IoT, improving patient outcomes in healthcare, detecting fraud in financial services, etc. So while simple scenarios are becoming a commodity, business scenarios that provide the most value are beyond the reach of most organizations.

Why?

Because the Machine Learning or Data Scientist lifecycle is complex. A successful implementation includes a business requirements phase, data preparation, data modeling, and production deployment work. The last three phases are particularly resource intensive.

  • The data preparation phase involves collecting the data, cleansing the data, and transforming the data—and multiple sets of data are required for scoring and testing.
  • The data modeling phase is especially demanding and involves feature engineering, algorithm selection, testing, tuning and model optimization. These steps need to be repeated until the models reach an acceptable level of quality.
  • Then there is the deployment—you have to take the models and deploy them in production using operational data. The work doesn’t end there, as you must continuously review and revise the models to keep up with changes in the environment.
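The prepare → model → deploy loop described above can be sketched in a few dozen lines of plain Python. This is a deliberately minimal illustration, not anyone's production pipeline: every function, the moving-average "model," and the quality threshold are invented for the example.

```python
import random
import statistics

# --- Data preparation: collect, cleanse, transform ---
def prepare(raw):
    """Drop missing readings and center the series to zero mean."""
    clean = [x for x in raw if x is not None]
    mu = statistics.mean(clean)
    return [x - mu for x in clean]

# --- Data modeling: fit, evaluate, repeat until quality is acceptable ---
def fit_moving_average(series, window):
    """The 'model' is a trailing moving average; returns mean absolute error."""
    errors = [abs(series[i] - statistics.mean(series[i - window:i]))
              for i in range(window, len(series))]
    return statistics.mean(errors)

def model_until_acceptable(series, candidate_windows, target_mae):
    """Iterate over configurations until one reaches the target quality."""
    best = None
    for w in candidate_windows:
        mae = fit_moving_average(series, w)
        if best is None or mae < best[1]:
            best = (w, mae)
        if mae <= target_mae:  # stop once quality is acceptable
            break
    return best

# --- Deployment: score new operational data with the chosen model ---
def score(series, window):
    return statistics.mean(series[-window:])

random.seed(0)
# Simulated raw feed with occasional missing readings
raw = [10 + random.gauss(0, 1) if i % 50 else None for i in range(300)]
series = prepare(raw)
window, mae = model_until_acceptable(series, [2, 5, 10, 20], target_mae=0.8)
print(window, round(mae, 3), round(score(series, window), 3))
```

Even this toy version makes the cost structure visible: each modeling iteration re-runs evaluation over the full series, and in real projects each of those steps (cleansing rules, feature engineering, algorithm selection, tuning) is far heavier.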

It’s pretty clear that this is a completely different challenge that the options described above can’t address. While there are cloud options that will manage the infrastructure, and there are tools that make the data scientist more efficient, there is a dearth of solutions that tackle the democratization of complex Machine Learning.

The need for democratization is driven by the amount of time and resources it takes to do this manually—even with a team of data scientists. And for those that don’t have data scientists, this is a non-starter given traditional tools and solutions.

Enter Machine Learning and Meta-Learning

It’s evident that there is a need for a better way forward when it comes to solving these complex business challenges. Data scientists have to be freed from the laborious day-to-day grind that consumes so much of their time today, enabling them to more effectively support a higher number of business scenarios in less time.

Progress DataRPM is designed specifically to meet this need. By developing an innovative machine-automated approach, we are able to automate a range of complex tasks that the other solutions above simply can’t.

  • DataRPM uses a metadata approach to remember, share and apply learnings from model experiments. This approach speeds the iterative process required to build and test models, and has also proven to dramatically increase the accuracy of production analytic results.
  • DataRPM also leverages a novel approach for detecting failures. Traditional methods limit the analytics approach to building models that identify future failures or that are optimized strictly on past failures. This provides poor coverage, because it can’t predict random failures, which are the predominant type of failure. DataRPM instead models normal behavior and then detects deviations from it. These deviations are flagged as potential problems that the business can manage effectively, and that intelligence is fed back into the model so it continuously improves based on production data.
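The "model normal, flag deviations" idea can be illustrated with a simple rolling z-score detector. To be clear, this is a generic anomaly-detection sketch under assumed parameters (window size, threshold), not DataRPM's actual algorithm:

```python
import statistics

def detect_anomalies(series, window=20, threshold=3.0):
    """Model 'normal' as the rolling mean/std of recent readings and
    flag points deviating beyond `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu = statistics.mean(recent)
        sigma = statistics.stdev(recent) or 1e-9  # guard against flat windows
        if abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Steady sensor signal with one injected fault at index 60
signal = [5.0 + 0.1 * ((i * 7) % 5) for i in range(100)]
signal[60] = 9.0
print(detect_anomalies(signal))  # → [60]
```

Note that the detector needs no failure history at all: it learns what "normal" looks like from recent operational data, which is exactly why this framing covers random failures that past-failure models miss.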

This solution allows your team to focus on the most strategic and actionable part of the process: analyzing and assessing the results. Whether or not you currently employ data scientists, it reduces the amount of time you need to allocate to creating and evaluating complex models.
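The experiment-memory idea behind the metadata approach can also be sketched in a toy form. Everything here — the dataset signature, the store, the warm-start rule — is a hypothetical illustration of the general technique, not DataRPM's implementation:

```python
# Toy experiment memory: remember which configuration worked best for
# datasets with similar characteristics, and try it first next time.
experiment_store = {}  # dataset signature -> (best_config, best_score)

def dataset_signature(series):
    """Coarse characteristics used to match 'similar' datasets."""
    n = len(series)
    mean = sum(series) / n
    spread = max(series) - min(series)
    return (n // 100, round(spread / (abs(mean) + 1e-9), 1))

def evaluate(series, config):
    """Stand-in scoring function: lower is better (illustrative only)."""
    window = config["window"]
    errors = [abs(series[i] - sum(series[i - window:i]) / window)
              for i in range(window, len(series))]
    return sum(errors) / len(errors)

def run_experiment(series, candidate_configs):
    sig = dataset_signature(series)
    # Warm start: try the configuration that won on similar data first
    if sig in experiment_store:
        candidate_configs = [experiment_store[sig][0]] + candidate_configs
    best_config, best_score = None, float("inf")
    for config in candidate_configs:
        score = evaluate(series, config)
        if score < best_score:
            best_config, best_score = config, score
    experiment_store[sig] = (best_config, best_score)  # remember the outcome
    return best_config, best_score
```

On a second, similar dataset the search starts from the remembered winner, which is the sense in which learnings from past model experiments shorten the iterative build-and-test loop.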

Rather than constrain analytics and generate a simple or limited result, the meta-learning approach looks fully at the unique problems facing your business, is flexible enough to adapt to new problems as they arise, and is constantly improving. By automating some of the most arduous components of data analysis, you’re free to focus on delivering the insights and outcomes you need—quickly. It’s all part of our cognitive-first vision for business applications. You can learn more about our platform for cognitive predictive maintenance here.


More Stories By Progress Blog

Progress offers the leading platform for developing and deploying mission-critical, cognitive-first business applications powered by machine learning and predictive analytics.
