Protecting the Network with Proactive Encryption Monitoring

Encryption technology is everywhere: in applications, data centers and other foundation infrastructure

Encryption is a key element of a complete security strategy. The 2013 Global Encryption Trends Study shows a steady increase in the use of encryption solutions over the past nine years. Thirty-five percent of organizations now have an encryption strategy applied consistently across the entire enterprise, up from 29 percent in 2012. The study also showed that, for the first time, the main goal for most organizations in deploying encryption is mitigating the effects of data breaches. There is good reason for this shift: the latest Ponemon Institute research puts the average cost of a data breach at $3.5 million, up 15 percent from the previous year.

On the surface, the 35 percent figure seems like good news, until one realizes that 65 percent of organizations do not have an enterprise-wide encryption strategy. In addition, even a consistently applied strategy can lack visibility, management controls or remediation processes. This gives hackers the green light to attack as soon as they spot a vulnerability.

While organizations are moving in the right direction when it comes to encryption, much more needs to be done - and quickly. Encryption has come to be viewed as a commodity: organizations deploy it and assume they've taken the steps they need to maintain security. If breaches occur, it's rarely the fault of the software or the encryption protocol. The fault lies rather in the fact that encryption management is left in the domain of IT system administrators and has never been properly managed with access controls, monitoring or proactive data loss prevention.

Too Many Keys Spoil the Security
While recent high-profile vulnerabilities have exposed the need to manage encrypted networks better, it's important to understand that administrators can cause vulnerabilities as well. In the Secure Shell (SSH) data-in-transit protocol, key-based authentication is one of the more common methods used to gain access to critical information. Keys are easy to create and, at the most basic level, are simple text files that can be easily uploaded to the appropriate system. Associated with each key is an identity - either a person or a machine - that is granted access to information assets and can perform specific tasks, such as transferring a file or dropping a database, depending on the assigned authorizations. In the case of Secure Shell keys, those basic text files provide access to some of the most critical information within an organization.
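To make this concrete, the following sketch (Python, standard library only, a simplified illustration rather than a production parser) lists the entries in a single OpenSSH authorized_keys file and prints each key's type, SHA-256 fingerprint and comment. The file path is illustrative; the point is that every line of this plain text file is a standing grant of access.

import base64
import hashlib
from pathlib import Path

# Illustrative path: one user's authorized_keys file on one host.
AUTHORIZED_KEYS = Path.home() / ".ssh" / "authorized_keys"

def fingerprint(b64_blob: str) -> str:
    # OpenSSH-style SHA-256 fingerprint of the raw public key blob.
    digest = hashlib.sha256(base64.b64decode(b64_blob)).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

for line in AUTHORIZED_KEYS.read_text().splitlines():
    fields = line.strip().split()
    if not fields or fields[0].startswith("#"):
        continue  # skip blank lines and comments
    # An entry may begin with options (e.g., command="..."), so locate the
    # key-type token before reading the key blob and trailing comment.
    for i, token in enumerate(fields):
        if token.startswith(("ssh-", "ecdsa-")) and i + 1 < len(fields):
            comment = " ".join(fields[i + 2:]) or "<no comment>"
            print(f"{token:20} {fingerprint(fields[i + 1])}  {comment}")
            break

Each printed line corresponds to one identity that can authenticate to this account without a password, which is why these files deserve the same scrutiny as account databases.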

A quick calculation will reveal that the number of keys assigned over the past decade to employees, contractors and applications can run to a million or more for a single enterprise. In one example, a major bank with around 15,000 hosts had over 1.5 million keys circulating within its network environment. Around 10 percent of those keys - or 150,000 - provided high-level administrator access. This represents an astonishing number of open doors that no one was monitoring.
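The arithmetic behind that example is simple enough to sketch; the per-host figure below is an assumption chosen only to reproduce the bank example above, not a number from any study.

hosts = 15_000                       # hosts in the example environment
keys_per_host = 100                  # assumed average keys per host (illustrative)
total_keys = hosts * keys_per_host   # 1,500,000 keys in circulation
admin_keys = int(total_keys * 0.10)  # roughly 150,000 grant administrator access
print(f"{total_keys:,} keys, {admin_keys:,} with high-level access")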

It may seem impossible that such a security lapse could happen, but consider that encryption is often perceived merely as a tool. Because nothing on the surface appeared to be out of place, no processes were shut down and the problem went undetected.

Safety Hazards
Forgetting to keep track of keys is one problem; failing to remove them is another. System administrators and application developers often deploy keys in order to gain ready access to the systems they are working on. These keys grant a fairly high level of privilege and are often used across multiple systems, creating a one-to-many relationship. In many cases, employees or contractors who are terminated - or simply reassigned to tasks that no longer require the same access - continue to hold access via Secure Shell keys, because the assumption is that terminating the account is enough. Unfortunately, that is not the case where Secure Shell keys are involved: the keys must also be removed, or the access remains in place.
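A deprovisioning process therefore has to treat keys as accounts in their own right. As a minimal sketch - assuming key comments follow the common user@host convention and that an authoritative feed of active identities exists - an audit could flag entries that no longer map to anyone:

from pathlib import Path

# Hypothetical inputs: a real audit would pull these from an HR or identity
# management feed and walk every host, not a single hard-coded file.
active_users = {"alice", "appsvc-backup"}
keys_file = Path("/home/appsvc/.ssh/authorized_keys")

for line in keys_file.read_text().splitlines():
    fields = line.strip().split()
    if len(fields) < 3 or fields[0].startswith("#"):
        continue  # skip blanks, comments and entries without an attributable comment
    owner = fields[-1].split("@")[0]  # key comments are often user@host
    if owner not in active_users:
        print(f"stale key: owner {owner!r} is no longer an active identity")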

SSH keys pose another threat as well: subverting privileged access management (PAM) systems. Many PAM solutions use a gateway or jump host that administrators log into in order to reach network assets. These solutions connect with user directories to assign privileges, monitor user actions and record which actions have taken place. While this appears to be an airtight way to monitor administrators, it is remarkably easy for an administrator to log into the gateway, deploy a key and then log back in using key authentication, thereby circumventing any PAM safeguards in place.
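The bypass is as simple as it sounds: once a personal key has been planted in authorized_keys on the jump host, later logins authenticate with that key and never touch the PAM-brokered credential flow. A minimal sketch using the paramiko library, with hostname, username and key path purely illustrative:

import paramiko

# Key-based login to the gateway. Because authentication uses a key the
# administrator planted rather than a vaulted credential, the PAM gateway's
# password checkout and session brokering never come into play.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    "jumphost.example.com",                      # hypothetical gateway
    username="admin",
    key_filename="/home/admin/.ssh/id_ed25519",  # planted private key
)
_, stdout, _ = client.exec_command("hostname")
print(stdout.read().decode().strip())
client.close()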

Too Clever for Their Own Good
Poorly monitored access is just one security hazard in encrypted environments. Conventional PAM solutions, which use gateways and focus only on interactive users, are designed to monitor administrator activities. Unfortunately, as mentioned earlier, they end up being fairly easy to work around. Additionally, encryption blinds security operations and forensics teams just as effectively as it hides data from attackers. For this reason, encrypted traffic is rarely monitored and is allowed to flow freely in and out of the network environment. This creates obvious risks and largely negates security intelligence capabilities.

The Internet offers many articles on how to use Secure Shell to bypass corporate firewalls. This is a fairly common and clever workaround to policy that unfortunately creates a huge security risk. To eliminate that risk, the organization must be able to decrypt and inspect the traffic.

Traffic Safety
Decrypting Secure Shell traffic requires an organization to deploy an inline proxy with access to the private keys - essentially a friendly man-in-the-middle - that can decrypt the traffic without interfering with the network. When successfully deployed, 100 percent of encrypted traffic for both interactive users and machine-to-machine (M2M) identities can be monitored. Also, because this is done at the network level, it is not possible for malicious parties to execute a workaround. With this method, enterprises can proactively detect suspicious or out-of-policy traffic. This approach is called encrypted channel monitoring, and it represents the next generation in the evolution of PAM.

This kind of monitoring solves the issue of decrypting traffic at the perimeter and helps organizations move away from a gateway approach to PAM. At the same time, it prevents attackers from using the organization's own encryption technology against it. In addition, an organization can use inline access controls and user profiling to control which activities a user can undertake. For example, policy controls can be enforced to forbid file transfers from certain critical systems. With more advanced solutions, an organization can even block subchannels from running inside the encrypted tunnel - a preferred method for quickly exfiltrating data.
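There is no standard API for these controls, so the sketch below is purely conceptual: a toy per-host policy table evaluated for each requested SSH channel, of the sort an encrypted channel monitoring proxy could enforce. Host names and channel labels are illustrative stand-ins.

# Toy policy: which channel types each host may carry. "session" stands in for
# an interactive shell, "sftp" for file transfer, and "direct-tcpip" for port
# forwarding (a tunneled subchannel).
POLICY = {
    "db-prod-01": {"session"},        # critical system: shell only, no transfers
    "default": {"session", "sftp"},
}

def channel_allowed(host: str, channel_type: str) -> bool:
    # Fall back to the default rule when a host has no explicit entry.
    return channel_type in POLICY.get(host, POLICY["default"])

print(channel_allowed("db-prod-01", "sftp"))      # False: file transfer blocked
print(channel_allowed("db-prod-01", "session"))   # True: interactive shell allowed
print(channel_allowed("web-42", "direct-tcpip"))  # False: tunneling not in default rule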

Encryption technologies are often set up without effective monitoring or proper access controls, which leaves layered defenses blind as well. A major vulnerability could compromise an entire server, which could in turn expose other areas of the network to subsequent attacks.

A Healthy Respect for Encryption
Encryption technology is everywhere: in applications, data centers and other foundation infrastructure. While it has been widely embraced, it has also often been abused, misused or neglected. Most organizations have not instituted centralized provisioning, encrypted channel monitoring and other best practices, even though the consequences of inadequate security can be severe. IT security staff may think conventional PAM is keeping their organizations safe, when commonly known workarounds are instead putting their data in jeopardy.

No one understands better than IT administrators how critical network security is. That understanding should spur security professionals to do everything in their power to make their organizations' data as safe as possible. Given all that can go awry, organizations should examine their encrypted networks, enable layered defenses and put proactive monitoring in place if they have not yet done so. An all-inclusive encrypted channel monitoring strategy will go a long way toward securing the network.

More Stories By Jason Thompson

Jason Thompson is director of global marketing for SSH Communications Security. He brings more than 12 years of experience launching new, innovative solutions across a number of industry verticals. Prior to joining SSH, he worked at Q1 Labs, where he helped build awareness around security intelligence and holistic approaches to dealing with advanced threat vectors. Mr. Thompson holds a BA from Colorado State University and an MA from the University of North Carolina at Wilmington.
