By Adam Vincent
December 10, 2010 09:24 AM EST
The 9/11 Commission Report cited "pervasive problems of managing and sharing information across a large and unwieldy government that had been built in a different era to confront different dangers". Since 9/11, governments around the world have considerably adjusted their stance on information sharing to allow more adequate and timely exchange of information. Unfortunately, the need to share information quickly often took priority over the need to protect it, leaving security policies, certification and accreditation practices, and existing security controls behind.
WikiLeaks may jeopardize all we've worked towards to enhance information sharing, and impede pursuits to make information-sharing more effective. Or it may serve as a wakeup call that our current policies, processes and solutions are not adequate in today's world where information must be collected, fused, discovered, shared and protected at network speed.
Here at Layer 7, we've been working with government agencies worldwide to support their needs for sharing information more quickly, while introducing a more robust set of access and security controls that grant access to privileged information only to those with a genuine need to know. In the following paragraphs, I'm going to discuss how Layer 7 Technologies aids in breaking down information-sharing silos while maintaining a high degree of information protection, control and tracking.
There are multiple efforts underway across government agencies to use digital policy to control who gets access to what information when, as opposed to relying on a written policy. Layer 7's policy-oriented controls allow for digital policy to be defined and enforced across distributed information silos. Whether inside an enterprise or in the cloud, using Layer 7, government agencies and commercial entities can define and enforce rules for information discovery, retrieval and dissemination across a variety of security realms and boundaries. With the right kind of policy controls, companies can avoid a WikiLeaks incident of their own.
Layer 7 provides information plumbing for the new IT reality. Using Layer 7 products, organizations can address the following areas:
Data Exfiltration – The WikiLeaks scandal broke because of a single user’s ability to discover, collect and exfiltrate massive quantities of information, much of which was not needed for the user's day-to-day activities. With Layer 7, digital policies can be defined and enforced that put limits on the number of times a single user can retrieve a single type of data, or multiple types of data that, when aggregated together, could indicate malicious intent. If the user goes beyond his administratively imposed limit, Layer 7 can either allow the operation while notifying administrative or security personnel of the potential issue, or can disallow access altogether while awaiting remediation.
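To make the idea concrete, here is a minimal sketch of that kind of retrieval limit. The sliding window, the soft and hard limits, and the decision names are assumptions for illustration, not Layer 7's actual policy model:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 3600   # sliding one-hour window (assumed)
SOFT_LIMIT = 50         # beyond this, allow the request but alert security
HARD_LIMIT = 100        # beyond this, deny access pending remediation

_retrievals = defaultdict(deque)  # (user, data_type) -> recent timestamps

def check_retrieval(user, data_type, now=None):
    """Record one retrieval and return 'allow', 'allow_and_alert' or 'deny'."""
    now = time.time() if now is None else now
    window = _retrievals[(user, data_type)]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()              # drop events outside the sliding window
    window.append(now)
    if len(window) <= SOFT_LIMIT:
        return "allow"
    if len(window) <= HARD_LIMIT:
        return "allow_and_alert"      # notify security personnel, let it pass
    return "deny"                     # block access until remediated
```

The same counter could key on aggregated data types rather than a single type, so that a user sweeping many collections at once trips the limit sooner.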
Access Control – The heart of any information system is its ability to grant access to people who meet the "need to know" requirement for the information contained within. The reality with government organizations is that many information systems rely on the user’s level of clearance, the network he is using, or coarse-grained information like the branch of service he belongs to, in order to grant or deny access to an information-sharing system in its entirety. For those going beyond the norm with Role Based Access Control (RBAC), the burden of administering hundreds or thousands of users, based on groups, is formidable and limits the effectiveness of the system; it increases the likelihood that the system has authorized users who no longer have a “need to know” for the information.
Layer 7 policy enforcement and decision allows for user authorization through either Attribute Based Access Control (ABAC) or Policy Based Access Control (PBAC). These types of authorizations correlate through policy, attributes about the user, resource and environment in order to allow/deny access. Attributes can be collected from local identity repositories or from enterprise attribute services.
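As an illustration of how such a decision correlates attributes, here is a toy ABAC check. The attribute names, classification levels and the policy itself are assumptions for the sketch, not Layer 7's policy language:

```python
# Ordered sensitivity levels (assumed for the example).
LEVELS = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]

def abac_decide(user_attrs, resource_attrs, env_attrs):
    """Permit only when user, resource and environment attributes all
    satisfy the policy: clearance dominates classification, the user
    holds the resource's compartment, and the request arrives over an
    approved network."""
    clearance_ok = (LEVELS.index(user_attrs["clearance"])
                    >= LEVELS.index(resource_attrs["classification"]))
    compartment_ok = resource_attrs["compartment"] in user_attrs["compartments"]
    network_ok = env_attrs["network"] in resource_attrs["approved_networks"]
    return clearance_ok and compartment_ok and network_ok
```

In a real deployment the user and environment attributes would be pulled from identity repositories or enterprise attribute services at decision time, rather than passed in directly.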
In addition, enterprise attribute services can be federated to allow for attributes to be shared across organizations, thereby minimizing the requirement of having to manage attributes about users from other organizations. An often-overlooked factor of authorization is the need to tie typical authorization policy languages like XACML (is user X allowed to access resource Y) to policies around data exfiltration, data sanitization and transformation, and audit. This is the area where Layer 7 stands out: not only do we have the ability to authorize the user, but we can also enforce a wide variety of policy controls that are integrated with access control.
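The point about tying authorization to exfiltration, sanitization and audit controls can be sketched as a single enforcement pass. Function names and the redaction scheme here are assumptions, not Layer 7 APIs:

```python
LEVELS = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]
audit_log = []  # stand-in for a federated audit service

def sanitize(document, user_clearance):
    """Redact fields marked above the requester's clearance.
    `document` maps field name -> (value, classification)."""
    allowed = LEVELS.index(user_clearance)
    return {k: (v if LEVELS.index(lvl) <= allowed else "[REDACTED]")
            for k, (v, lvl) in document.items()}

def enforce(user, document, authorize):
    """Authorize (XACML-style allow/deny), audit, then sanitize."""
    decision = authorize(user)
    audit_log.append({"user": user["id"], "decision": decision})
    if not decision:
        return None                               # denied outright
    return sanitize(document, user["clearance"])  # redact before release
```

The access decision is only the first control: every request leaves an audit record, and even a permitted response is filtered down to what the requester is cleared to see.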
The following blog posts by Anil John, a colleague who specializes in the identity space, provide good information about the benefits and needs of the community in moving from roles to policy and attributes: "Policy Based Access Control (PBAC)" and "Federated Attribute Services".
Monitoring, Visibility & Tracking - Even when controls are in place that help mitigate the issue of “need to know,” there will always be a risk of authorized users collecting information within the norms of their current job and role. In support of this, visibility of usage by the individual IT system owner and across enterprise systems is key to limiting this type of event in the future. Layer 7 allows for federation of monitoring data so information about data accesses can be shared with those organizations monitoring the network or enterprise. This allows authentication attempts and valid authorizations to be tracked, and distributed data retrieval trends analyzed on a per user basis across the extended enterprise.
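One simple way to analyze retrieval trends on a per-user basis, as described above, is to compare each user's activity against his own historical baseline. This sketch (the threshold and data shape are assumptions) flags users whose daily retrieval count deviates sharply from their norm:

```python
from statistics import mean, stdev

def flag_anomalous_users(daily_counts, threshold=3.0):
    """daily_counts maps user -> list of per-day retrieval counts, with the
    last entry being today's count. Flag users whose count today exceeds
    their historical mean by `threshold` standard deviations."""
    flagged = []
    for user, counts in daily_counts.items():
        history, today = counts[:-1], counts[-1]
        if len(history) < 2:
            continue  # not enough baseline data to judge
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            if today > mu:
                flagged.append(user)
        elif (today - mu) / sigma > threshold:
            flagged.append(user)
    return flagged
```

With monitoring data federated across organizations, the baseline can span the extended enterprise rather than a single system, so a user quietly pulling modest amounts from many systems still stands out in aggregate.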
Preventing the leakage of privileged information to unauthorized users can never be 100% guaranteed. However, with a policy-based information control like Layer 7 in place, access to confidential information can be restricted and tracked.