

Linux Containers: Press Release

Now Update Linux Servers with No Downtime

KernelCare automatically updates Linux servers with no need to reboot and no need to delay applying security patches

With KernelCare, now available from CloudLinux, scheduled outages for security patches on Linux servers are a thing of the past, giving organizations real-time updates.

KernelCare automatically applies Linux server security updates without a reboot, freeing technical personnel from a laborious process that takes several minutes for every server, several times a year.

"This is the equivalent of changing the engine on an airplane while it's flying," said Dan Olds, principal analyst, Gabriel Consulting Group. "I think this will be viewed as a no brainer purchase when you consider the cost of less than $50 annually per server for having the protection of kernel security updates without downtime."

"In our experience, KernelCare has worked perfectly and we love it because we no longer have to suffer through performance issues related to re-booting servers," said Wouter de Vries, founder and CEO of Antagonist, a web hosting provider in the Netherlands (www.antagonist.nl). "Plus, now we don't have to wait to find a window of opportunity to apply security updates because those are done automatically as soon as they're available."

KernelCare loads patches using a single kernel module for maximum efficiency, with no impact on performance since applying each patch takes only nanoseconds. For container virtualization such as OpenVZ, there is only one Linux kernel to update no matter how many virtual servers are running on the physical host. Without live patching, updating that kernel means rebooting the host: if 100 virtual servers are running on one physical host, each would have to be stopped and restarted. With KernelCare, patches are applied automatically as soon as they become available, while the machine continues to run - no delay, no downtime.
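As an illustration of what "no reboot" means in practice, a quick check on a patched server might look like the sketch below. This is only a sketch: the `kcarectl` utility name and its `--uname` flag are assumptions about KernelCare's command-line tooling, and are shown as comments rather than run.

```shell
# The stock tool still reports the kernel version the machine booted with:
uname -r

# With KernelCare installed, its CLI (assumed here to be kcarectl) would
# report the effective, live-patched kernel version without a reboot:
#   kcarectl --uname

# Because patches are applied in memory, uptime keeps counting through
# every security update:
uptime
```

The key point is that the booted kernel never changes on disk during a live patch, so standard tools continue to report the original version while the running kernel carries the fixes.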

"Today, system administrators have to re-boot a server to apply the latest kernel security updates, which come out every one to two months," said Igor Seletskiy, founder and CEO, CloudLinux. "However, because they require a scheduled update (to minimize disruptions from downtime), they are often delayed - sometimes months or even years - which means the server is running with known security vulnerabilities. The problem of having to schedule downtime and then update and re-boot servers in a short period of time is a strain on resources for enterprises of every size. KernelCare solves this update and re-boot issue by providing live kernel patching without the need for the re-boot."

Availability and Pricing

KernelCare is available via monthly subscription at $3.95 per server; ordering instructions can be found at http://kernelcare.com/pricing.php. The kernel module is released under GPL2, while other components are distributed in binary-only format under license. KernelCare has been in limited availability to web hosting providers for the past month and is already installed on more than 500 servers. KernelCare is now available for CentOS 6, Red Hat Enterprise Linux (RHEL) 6, CloudLinux OS 6 and OpenVZ (64-bit only). CloudLinux plans to add support for Debian and Ubuntu, as well as CentOS 5, RHEL 5 and CloudLinux OS 5, within the next 60 days. In addition, RHEL 7 will be supported once it is out of beta.

About CloudLinux

CloudLinux, founded in 2009, is a privately held company with headquarters in Princeton, N.J., and development based in Donetsk, Ukraine. The company has technical expertise in kernel development and customers that include 2,000 service providers worldwide. Its CloudLinux OS is used on more than 18,000 servers for increased server stability and security, bringing far greater efficiency to web hosting providers. For more information, visit www.cloudlinux.com.

More Stories By Glenn Rossman

Glenn Rossman has more than 25 years of communications experience working at IBM and Hewlett-Packard, along with startup StorageApps, plus agencies Hill & Knowlton and G&A Communications. His experience includes media relations, industry and financial analyst relations, executive communications, intranet and employee communications, as well as producing sales collateral. In technology, his career includes work in channel partner communications, data storage technologies, server computers, software, and PC and UNIX computers, along with specific industry initiatives in manufacturing, medical, and finance. Before his latest stint in technology, Glenn did business-to-business public relations on behalf of the DuPont Company for its specialty polymers products and with the largest steel companies in North America in an initiative focused on automakers.
