
What's Different Now: Five Reasons Why AI Is Suddenly Accessible
By Mark Troester

In recent years, AI has grown significantly and become a substantial area of business investment. What has changed, and how does this affect you?

For a long time, artificial intelligence was pure science fiction, relegated to books, television and movies—and you don’t need us to tell you that we are well past that point today. In the last few years we have seen extremely rapid advancement in a series of technologies that have come together to unlock a wave of AI investment. According to Accenture, 85% of executives plan to invest extensively in AI in the next three years, and in the same time period, Forrester estimates that businesses using AI will “steal” $1.2 trillion from companies that don’t.

Whether they have implemented AI into their business plans or not, most organizations are now spending considerable time and money thinking about it. Why is AI becoming so accessible today? There are five major reasons driving this change.

1. The Internet of Things
More machines are instrumented than ever before, as sensors have proliferated across devices from connected cars to industrial machinery. The result is an explosion of data that is being collected from myriad sources, providing businesses with the raw material needed for powerful analytics.

2. Data Lakes
The deluge of data would be of limited value if it were locked in silos and hard to access. Aided by new technologies, organizations have become more adept at consolidating data into data lakes, providing a single place where data from across the company can be effectively analyzed together.
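As a toy illustration of why consolidation matters, the sketch below loads records from two hypothetical silos (telemetry and maintenance logs; the table names, assets, and values are invented for the example) into a single SQLite store, where one query can analyze them together:

```python
import sqlite3

# Hypothetical records from two separate silos.
telemetry = [("truck-1", 92.5), ("truck-2", 101.3)]
maintenance = [("truck-1", "brake pads"), ("truck-2", "coolant flush")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE telemetry (asset TEXT, engine_temp REAL)")
conn.execute("CREATE TABLE maintenance (asset TEXT, last_service TEXT)")
conn.executemany("INSERT INTO telemetry VALUES (?, ?)", telemetry)
conn.executemany("INSERT INTO maintenance VALUES (?, ?)", maintenance)

# With both datasets in one place, a single query can join them.
rows = conn.execute(
    "SELECT t.asset, t.engine_temp, m.last_service "
    "FROM telemetry t JOIN maintenance m ON t.asset = m.asset "
    "ORDER BY t.asset"
).fetchall()
for row in rows:
    print(row)
```

A real data lake spans far more sources and formats, but the payoff is the same: questions that once required stitching together exports from separate systems become one query.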

3. Computational Infrastructure
With so much data being collected, an enormous amount of computing power is required to analyze it all. Fortunately, computational infrastructure has never been more powerful or more readily available, including the ability to run workloads in parallel.
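To illustrate the idea of parallel workloads, here is a minimal Python sketch that fans a per-device analysis out across a worker pool; the device names and readings are invented for the example, and the `analyze` step stands in for much heavier real-world computation:

```python
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

# Hypothetical sensor batches, one per device.
batches = {
    "pump-a": [71.2, 73.9, 75.1],
    "pump-b": [88.0, 86.4, 90.2],
    "pump-c": [65.5, 64.8, 66.1],
}

def analyze(item):
    device, readings = item
    # Stand-in for a heavier analysis step (feature extraction, scoring, ...).
    return device, round(mean(readings), 2)

# Fan the per-device analyses out across a pool of workers.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(pool.map(analyze, batches.items()))

print(results)
```

At cloud scale the same pattern runs across fleets of machines rather than threads, which is exactly what modern infrastructure makes cheap and routine.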

4. Machine Learning Advances
Machine learning has advanced rapidly, and learning algorithms have become increasingly powerful and capable. They are now well suited to solving a variety of complex problems, from predicting the durability of production machinery to anticipating or even preventing recalls, not to mention identifying images and winning at Go.
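As a minimal illustration of the kind of learning involved, the sketch below uses a k-nearest-neighbors vote over an invented sensor history to flag machinery at risk of failure; production systems use far richer features and far more sophisticated models, but the learn-from-labeled-history principle is the same:

```python
import math

# Hypothetical labeled history: (temperature, vibration) -> failed soon after? (1/0)
history = [
    ((70.0, 0.2), 0), ((72.0, 0.3), 0), ((75.0, 0.25), 0),
    ((95.0, 0.9), 1), ((98.0, 1.1), 1), ((92.0, 0.8), 1),
]

def predict(sample, k=3):
    # k-nearest-neighbors: vote among the k closest historical readings.
    nearest = sorted(history, key=lambda rec: math.dist(rec[0], sample))[:k]
    votes = sum(label for _, label in nearest)
    return 1 if votes > k / 2 else 0

print(predict((94.0, 0.95)))  # readings near past failures -> 1
print(predict((71.0, 0.22)))  # readings near healthy machines -> 0
```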

AI Is Ready, but Still Hard to Scale
These trends have made it possible to incorporate AI into a line of business and produce impressive results, but doing so is often only feasible for the digital giants. That's because implementing AI at scale still requires expensive data science and analytics resources to generate and deploy accurate models. What if we could automate that last step, using cognitive capabilities to create and improve the models themselves? That possibility is the fifth reason, and it is the one that moves AI from possible to truly accessible.

5. Meta-Learning Automates Machine Learning
With meta-learning, the process of creating and tuning your models is automated, resulting in greater accuracy and faster results. Just as importantly, it greatly reduces the need to invest in expensive and hard-to-find data scientists, allowing you to run leaner and still produce even stronger results. This growing approach is key to enabling organizations across a wide spectrum of markets to implement AI solutions that produce powerful results, and to making sure you get your share of the $1.2 trillion Forrester estimates is up for grabs.
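To convey the spirit of automating model creation (this is a toy sketch, not DataRPM's actual method), the program below, rather than a data scientist, searches candidate models, scores each one, and keeps the winner; the dataset and single-threshold "model" are invented for the example:

```python
# Hypothetical labeled readings: (temperature, failed-soon label).
labeled = [(72, 0), (75, 0), (77, 0), (93, 1), (95, 1), (97, 1)]

def make_model(threshold):
    # The "model" is a single temperature cutoff.
    return lambda temp: 1 if temp >= threshold else 0

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

# Automated search: try every candidate threshold, score it, keep the best.
best = max(range(60, 110), key=lambda t: accuracy(make_model(t), labeled))
print(best, accuracy(make_model(best), labeled))
```

Real meta-learning systems search over model families, features, and hyperparameters and keep re-tuning as new data arrives, but the core move is the same: the selection work that once required a specialist is done by the machine.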

Learn More about Meta-Learning
This is part of the cognitive-first future that is coming soon for many industries, but for industrial IoT that future is already here. DataRPM has pioneered meta-learning in the field of cognitive predictive maintenance (CPdM) for industrial IoT; read about how they have operationalized the insights derived from data, and learn more about their platform.

We believe that CPdM is just the tip of the iceberg for meta-learning. Imagine the potential of this technology when it is applied to healthcare, hospitality, government and virtually all other industries that are grappling with the unprecedented explosion of data and how to best make sense of it all.


More Stories By Progress Blog

Progress offers the leading platform for developing and deploying mission-critical, cognitive-first business applications powered by machine learning and predictive analytics.
