
Now, Update Linux Servers with No Downtime

KernelCare automatically updates Linux servers with no need to reboot and no need to delay applying security patches

With KernelCare, now available from CloudLinux, scheduled outages for security patches on Linux servers are a thing of the past, giving organizations real-time updates.

KernelCare automatically applies Linux kernel security updates without requiring a reboot, freeing technical personnel from a laborious process that takes several minutes per server, several times a year.

"This is the equivalent of changing the engine on an airplane while it's flying," said Dan Olds, principal analyst, Gabriel Consulting Group. "I think this will be viewed as a no brainer purchase when you consider the cost of less than $50 annually per server for having the protection of kernel security updates without downtime."

"In our experience, KernelCare has worked perfectly and we love it because we no longer have to suffer through performance issues related to re-booting servers," said Wouter de Vries, founder and CEO of Antagonist, a web hosting provider in the Netherlands (www.antagonist.nl). "Plus, now we don't have to wait to find a window of opportunity to apply security updates because those are done automatically as soon as they're available."

KernelCare loads patches through a single kernel module for maximum efficiency, with no impact on performance, since applying a patch takes only nanoseconds. For container virtualization, such as OpenVZ, there is only one Linux kernel to update no matter how many virtual servers are running on the physical host. Without live patching, if 100 virtual servers were running on one physical host, each would have to be stopped and restarted for every kernel update. With KernelCare, patches are applied automatically as soon as they become available, while the machine continues to run - no delay, no downtime.
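To put the downtime difference in rough numbers, here is a minimal back-of-the-envelope sketch in Python. The per-server reboot time, update frequency and server count are illustrative assumptions drawn from figures mentioned in this release, not vendor measurements:

    # Illustrative estimate of downtime avoided by live kernel patching.
    # All figures are assumptions for the sake of the example, not
    # measurements from KernelCare or CloudLinux.
    REBOOT_MINUTES_PER_SERVER = 5    # "several minutes for every server"
    KERNEL_UPDATES_PER_YEAR = 8      # "every one to two months"
    VIRTUAL_SERVERS_PER_HOST = 100   # the 100-guest example above

    def yearly_downtime_minutes(servers, minutes_per_reboot, updates_per_year):
        """Total minutes of downtime per year if every update needs a restart."""
        return servers * minutes_per_reboot * updates_per_year

    reboot_based = yearly_downtime_minutes(VIRTUAL_SERVERS_PER_HOST,
                                           REBOOT_MINUTES_PER_SERVER,
                                           KERNEL_UPDATES_PER_YEAR)
    live_patched = 0  # patches apply in place while the machine keeps running

    print("Reboot-based patching: %d minutes of downtime per year" % reboot_based)
    print("Live patching: %d minutes of downtime per year" % live_patched)

Under those assumptions, reboot-based patching costs roughly 4,000 server-minutes of downtime per year on a single 100-guest host, while live patching costs none.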

"Today, system administrators have to re-boot a server to apply the latest kernel security updates, which come out every one to two months," said Igor Seletskiy, founder and CEO, CloudLinux. "However, because they require a scheduled update (to minimize disruptions from downtime), they are often delayed - sometimes months or even years - which means the server is running with known security vulnerabilities. The problem of having to schedule downtime and then update and re-boot servers in a short period of time is a strain on resources for enterprises of every size. KernelCare solves this update and re-boot issue by providing live kernel patching without the need for the re-boot."

Availability and Pricing

KernelCare is available via a monthly subscription of $3.95 per server; ordering instructions can be found at http://kernelcare.com/pricing.php. The kernel module is released under GPL2, while other components are distributed in binary-only format under license. KernelCare has been in limited availability to web hosting providers for the past month and is already installed on more than 500 servers. KernelCare is now available for CentOS 6, Red Hat Enterprise Linux (RHEL) 6, CloudLinux OS 6 and OpenVZ (64-bit only). CloudLinux plans to add support for Debian and Ubuntu, as well as CentOS 5, RHEL 5 and CloudLinux OS 5, within the next 60 days. In addition, RHEL 7 will be supported once it is out of beta.
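As a quick check on the figures quoted above (simple arithmetic, not additional vendor data), the $3.95 monthly rate is consistent with the "less than $50 annually per server" cost cited earlier in this release:

    # Annual cost at the published monthly subscription rate.
    MONTHLY_PRICE_USD = 3.95
    annual_cost = MONTHLY_PRICE_USD * 12
    print("$%.2f per server per year" % annual_cost)  # prints $47.40, under $50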

About CloudLinux

CloudLinux, founded in 2009, is a privately held company with headquarters in Princeton, N.J. and development based in Donetsk, Ukraine. The company has technical expertise in kernel development, and its customers include 2,000 service providers worldwide. Its CloudLinux OS is used on more than 18,000 servers for increased server stability and security, bringing far greater efficiency to web hosting providers. For more information, visit www.cloudlinux.com.

More Stories By Glenn Rossman

Glenn Rossman has more than 25 years of communications experience working at IBM and Hewlett-Packard, along with the startup StorageApps, plus agencies Hill & Knowlton and G&A Communications. His experience includes media relations, industry and financial analyst relations, executive communications, intranet and employee communications, as well as producing sales collateral. In technology, his career includes work in channel partner communications, data storage technologies, server computers, software, PC and UNIX computers, along with specific industry initiatives such as manufacturing, medical, and finance. Before his latest stint in technology, Glenn did business-to-business public relations on behalf of the DuPont Company for its specialty polymers products, and with the largest steel companies in North America in an initiative focused on automakers.
