February 26, 2017 05:30 PM EST
Software-defined infrastructure (SDI) is one of the most enduring trends in data centers and cloud environments, delivering greater infrastructure flexibility by defining the infrastructure in scripts and source code. However, SDIs are increasingly just a means to an end: automation, sophisticated algorithms and thus intelligent systems are leading toward an Artificial Intelligence defined Infrastructure (AI-defined Infrastructure, AiDI).
SDI is not dead but only a means to an end
An SDI abstracts the software from the hardware layer. The intelligence is no longer implemented in the hardware components but has moved into a software stack. Based on software and a high degree of automation, an SDI is designed to build and control an infrastructure largely without human interaction.
Typical SDI environments, e.g., cloud infrastructures, are built by developing scripts or source code. The software includes all the commands necessary to configure an entire infrastructure environment, including the applications and services running on top. An SDI-based infrastructure works independently of any particular hardware environment; a complete infrastructure can therefore be replaced through software, regardless of the underlying hardware components. This is one reason why SDIs are the fundamental foundation of today's cloud infrastructure environments.
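The idea can be sketched in a few lines: the entire environment is described as data (the "source code" of the infrastructure), and a generic engine turns that description into concrete resources. This is a minimal, hypothetical illustration; names such as `desired_state` and `provision` are invented here and do not belong to any real SDI tool.

```python
# Hypothetical sketch of infrastructure defined as source code: the whole
# environment is a declarative description, and a generic engine applies it.

desired_state = {
    "network": {"cidr": "10.0.0.0/16"},
    "servers": [
        {"name": "web-1", "cpus": 2, "memory_gb": 4},
        {"name": "web-2", "cpus": 2, "memory_gb": 4},
    ],
    "services": [{"name": "nginx", "runs_on": ["web-1", "web-2"]}],
}

def provision(state):
    """Walk the declarative description and 'create' each resource.

    In a real SDI tool this would call cloud or hypervisor APIs; here we
    only record the actions, to show that hardware is never touched directly.
    """
    actions = [f"create network {state['network']['cidr']}"]
    for server in state["servers"]:
        actions.append(f"create server {server['name']} "
                       f"({server['cpus']} vCPU, {server['memory_gb']} GB)")
    for service in state["services"]:
        for host in service["runs_on"]:
            actions.append(f"deploy {service['name']} on {host}")
    return actions

for action in provision(desired_state):
    print(action)
```

Because the environment is just data, swapping the underlying hardware means pointing the same description at a different backend.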
However, an SDI is anything but intelligent. It is based on static source code containing hard-coded commands that trigger certain actions automatically.
Understanding the AI-Defined Infrastructure (AiDI)
A software-defined infrastructure is an important concept for building and running dynamic IT environments. However, an SDI is limited by its static source code as well as by the skill set of the developer or administrator who writes the scripts and code for the environment. Furthermore, an SDI is only dynamic up to a certain point, since it cannot understand or learn from the environment it operates.
An Artificial Intelligence defined Infrastructure (AI-defined Infrastructure, AiDI) enhances an SDI with sophisticated algorithms, machine learning and artificial intelligence, fueling the SDI with intelligence. An AiDI allows an SDI to build and run self-learning and self-healing infrastructure environments. Without human interaction, AI-defined IT infrastructure environments are capable of:
- Deploying the necessary resources depending on the workload requirements and de-allocating them when they are no longer needed.
- Constantly analyzing the ever-changing behavior and status of every single infrastructure component, and thus understanding itself.
- Reacting, or proactively acting, on the status of individual infrastructure components by autonomously taking action, steering the entire infrastructure toward an error-free state.
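The capabilities above form a closed loop: observe each component, compare it against a healthy baseline, and act. The sketch below is a deliberately simplified illustration; the thresholds, component names and the restart/scale actions are invented, and a real AiDI would learn its baselines rather than hard-code them.

```python
# Simplified monitor -> analyze -> act cycle of a self-healing infrastructure.
# All values and action names are illustrative assumptions.

HEALTHY_CPU_RANGE = (0.05, 0.85)  # assumed baseline; would normally be learned

def control_loop(components):
    """One pass over the fleet, returning the remediation actions to take."""
    actions = []
    for name, metrics in components.items():
        if not metrics["responding"]:
            actions.append(("restart", name))    # reactive self-healing
        elif metrics["cpu"] > HEALTHY_CPU_RANGE[1]:
            actions.append(("scale_out", name))  # deploy more resources
        elif metrics["cpu"] < HEALTHY_CPU_RANGE[0]:
            actions.append(("scale_in", name))   # de-allocate idle resources
    return actions

fleet = {
    "web-1": {"cpu": 0.92, "responding": True},
    "web-2": {"cpu": 0.02, "responding": True},
    "db-1":  {"cpu": 0.40, "responding": False},
}
print(control_loop(fleet))
```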
An AI-defined Infrastructure cannot be compared with classic automation software, which typically works with predefined scripts. Instead, it utilizes a company's existing knowledge and executes it automatically and independently. Like every newborn organism, an AI-defined Infrastructure needs to be trained, but afterwards it can work autonomously. Based on the learned knowledge, disturbances can be resolved, even proactively for unexpected events, by correlating them with appropriate incidents from the past. To do so, an AI-defined Infrastructure monitors and analyzes all relevant components in real time to identify and solve a problem based on its existing knowledge. The more incidents are solved, the larger the infrastructure's knowledge base grows. The core of an AI-defined Infrastructure is a knowledge-based architecture that can analyze incidents and changes and autonomously develop strategies to solve an issue.
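The knowledge-based core described above can be pictured as a store of past incidents, each recorded with its observed symptoms and the fix that worked, against which new incidents are matched. The following is a toy sketch under strong assumptions: the symptom-overlap similarity measure and all sample data are invented for illustration, and a real system would use far richer models.

```python
# Toy knowledge-based incident resolution: match a new incident against
# past incidents by symptom overlap, and grow the knowledge base with
# every solved case. Data and similarity measure are illustrative only.

knowledge_base = [
    {"symptoms": {"disk_full", "writes_failing"}, "fix": "expand volume"},
    {"symptoms": {"high_latency", "cpu_saturated"}, "fix": "scale out service"},
    {"symptoms": {"process_down", "port_closed"}, "fix": "restart service"},
]

def resolve(symptoms, kb):
    """Return the fix of the past incident sharing the most symptoms."""
    best = max(kb, key=lambda incident: len(incident["symptoms"] & symptoms))
    if not best["symptoms"] & symptoms:
        return None  # nothing learned yet: escalate to a human
    return best["fix"]

def learn(symptoms, fix, kb):
    """Record a solved incident so future matches benefit from it."""
    kb.append({"symptoms": set(symptoms), "fix": fix})

print(resolve({"cpu_saturated", "high_latency", "fans_loud"}, knowledge_base))
```

Each call to `learn` enlarges the pool of solvable incidents, which mirrors the claim that the infrastructure's knowledge grows with every resolved disturbance.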
Furthermore, an AI-defined Infrastructure embraces communities to:
- Consume the knowledge from external experts to become more intelligent.
- Connect with other AI-defined Infrastructure environments to link, combine and share their knowledge bases.
- Constantly expand the knowledge pool.
- Optimize the knowledge.
All in all, an AI-defined Infrastructure is an intelligent system that, initially fueled with external knowledge, can learn and make decisions autonomously, without human interaction.
The AiDI is only a single piece of the entire AI-defined enterprise stack
An AI-defined Infrastructure is an essential part of today's IT operations, building the foundation for the AI-enabled enterprise. First and foremost, however, it enables IT departments to change the infrastructure's behavior from today's semi-dynamic mode to a true real-time IT environment.
This autonomous way of planning, building, running and maintaining the entire infrastructure lets IT operations and developers deploy IT resources such as servers, storage, networks, databases and other ready-to-use services in the most efficient way, drawing on the knowledge of the entire IT operations team rather than a single expert. Furthermore, IT operations is transformed from a pure consumer of resources into an orchestrator and manager of a completely automated and intelligent IT stack: the very foundation of an end-to-end AI-ready enterprise.