


Security Rapid Response Bulletin: Remediation for Heartbleed Vulnerability Requires Keys and Certificates to Be Replaced

Venafi Offers the Only Solution to Find and Fix Vulnerable Cryptographic Keys and Digital Certificates Across the Enterprise

SALT LAKE CITY, UT -- (Marketwired) -- 04/09/14 -- Venafi, the inventor of Next-Generation Trust Protection systems, today warns that the most devastating vulnerability of 2014 and beyond comes from failing to replace all keys and certificates on systems affected by the OpenSSL Heartbleed bug. Until those keys and certificates are replaced, Heartbleed leaves perpetually open doors into Global 2000 organizations and governments: attackers can spoof legitimate websites, decrypt private communications, and steal the most sensitive data.

The Heartbleed OpenSSL vulnerability affects at least 50% of public-facing web servers on the Internet and has, for the last three years, enabled attackers to extract private keys, digital certificates, and other sensitive data. Keys and certificates establish the trust that businesses and governments rely on for secure banking, ecommerce, and private communications. Attacks that exploit the recently publicized vulnerability are an order of magnitude larger than the Target Corporation data breach reported late last year, because Heartbleed affects virtually every organization that uses the Internet and can be exploited simply by connecting to a vulnerable website. No special skills or tools are required.

Register and attend a live webinar for more information on responding to Heartbleed at

To close the door on these vulnerabilities, organizations should follow these recommendations:

  • Identify all public-facing servers running OpenSSL 1.0.1 - 1.0.1f and upgrade them to OpenSSL 1.0.1g
  • Identify the keys and certificates to fix, based on knowledge of vulnerable applications
  • Generate new keys and X.509 certificates
  • Install the new keys and certificates on servers and revoke the vulnerable certificates
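As a minimal sketch of the version check in the first step, the vulnerable range (1.0.1 through 1.0.1f, with 1.0.1g containing the fix) can be tested programmatically. This checks only the OpenSSL build that the local Python links against; a real remediation effort must inventory the OpenSSL deployed on every server.

```python
import re
import ssl


def is_heartbleed_vulnerable(version_string: str) -> bool:
    """Return True if an OpenSSL version string falls in the
    vulnerable 1.0.1 - 1.0.1f range (1.0.1g contains the fix)."""
    m = re.search(r"OpenSSL (\d+)\.(\d+)\.(\d+)([a-z]?)", version_string)
    if not m:
        return False  # unrecognized format: cannot confirm vulnerability
    major, minor, patch, letter = m.groups()
    if (major, minor, patch) != ("1", "0", "1"):
        return False
    # 1.0.1 (no patch letter) through 1.0.1f are vulnerable; 1.0.1g and later are fixed
    return letter < "g"


# Check the OpenSSL build this Python interpreter links against
print(ssl.OPENSSL_VERSION, "-> vulnerable:",
      is_heartbleed_vulnerable(ssl.OPENSSL_VERSION))
```

Note that patching OpenSSL is only the first step; as the bulletin stresses, keys and certificates exposed while the flaw was live must still be regenerated and the old certificates revoked.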

As simple as these steps sound, many organizations are challenged to carry them out.

"While the Heartbleed code has been fixed, it is alarming that many organizations remain vulnerable. Most Global 2000 organizations and governments don't have a clear path to quickly change out the thousands of affected and exposed keys and certificates in order to ensure security," says Jeff Hudson, CEO of Venafi. "But if they don't change out every one of those keys and certificates quickly, the continued exposure to Heartbleed means attackers can keep spoofing legitimate websites, decrypting private communications, and stealing the most sensitive data."

Venafi can help affected organizations identify and change all the SSL keys and certificates that are vulnerable. Venafi's business is to help organizations move from a vulnerable situation to a safe, secure, and trusted state. Organizations can request help at

Venafi's incident response to Heartbleed includes Venafi TrustAuthority™, which identifies and replaces vulnerable keys and certificates. TrustAuthority builds an intelligent inventory of keys and certificates, understands how they are used, identifies vulnerabilities, and replaces the affected credentials. Further, TrustAuthority continuously monitors certificates, detecting and remediating anomalies as they arise. In other words, organizations get from vulnerable to secure and stay that way.

Many organizations that are Venafi customers today have responded rapidly to Heartbleed and returned to a known secure state using Venafi TrustForce. TrustForce fully automates the protection of keys and certificates, enabling organizations to protect hundreds of thousands of them and to respond by automatically replacing compromised keys and certificates in minutes.

Read the Venafi Customer Security Rapid Response Bulletin here.

To get the latest news and information about Venafi:
Visit the blog at
Follow us on Twitter: @Venafi
Follow us on LinkedIn:
Follow us on Google+:
Like us on Facebook:

About Venafi
Venafi is the leading cybersecurity company in Next-Generation Trust Protection (NGTP). Venafi delivered the first trust protection platform to secure the cryptographic keys and digital certificates that every business and government depends on for secure communications, commerce, computing, and mobility. As part of an enterprise infrastructure protection strategy, Venafi Trust Protection Platform prevents attacks on trust with automated discovery and intelligent policy enforcement, detects and reports on anomalous activity and increased threats, and remediates errors and attacks by automatically replacing keys and certificates. Venafi Threat Center provides research and threat intelligence for trust-based attacks. Venafi customers are among the world's most demanding, security-conscious Global 2000 organizations in financial services, insurance, high tech, telecommunications, aerospace, healthcare and retail. Venafi is backed by top-tier venture capital funds, including Foundation Capital, Pelion Venture Partners and Origin Partners. For more information, visit


Copyright © 2009 Marketwired. All rights reserved. All the news releases provided by Marketwired are copyrighted. Any forms of copying other than an individual user's personal reference without express written permission is prohibited. Further distribution of these materials is strictly forbidden, including but not limited to, posting, emailing, faxing, archiving in a public database, redistributing via a computer network or in a printed form.
